Oct  8 12:45:38 np0005477492 kernel: Linux version 5.14.0-620.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-11), GNU ld version 2.35.2-67.el9) #1 SMP PREEMPT_DYNAMIC Fri Sep 26 01:13:23 UTC 2025
Oct  8 12:45:38 np0005477492 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Oct  8 12:45:38 np0005477492 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64 root=UUID=1631a6ad-43b8-436d-ae76-16fa14b94458 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct  8 12:45:38 np0005477492 kernel: BIOS-provided physical RAM map:
Oct  8 12:45:38 np0005477492 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Oct  8 12:45:38 np0005477492 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Oct  8 12:45:38 np0005477492 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Oct  8 12:45:38 np0005477492 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Oct  8 12:45:38 np0005477492 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Oct  8 12:45:38 np0005477492 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Oct  8 12:45:38 np0005477492 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Oct  8 12:45:38 np0005477492 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Oct  8 12:45:38 np0005477492 kernel: NX (Execute Disable) protection: active
Oct  8 12:45:38 np0005477492 kernel: APIC: Static calls initialized
Oct  8 12:45:38 np0005477492 kernel: SMBIOS 2.8 present.
Oct  8 12:45:38 np0005477492 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Oct  8 12:45:38 np0005477492 kernel: Hypervisor detected: KVM
Oct  8 12:45:38 np0005477492 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Oct  8 12:45:38 np0005477492 kernel: kvm-clock: using sched offset of 4003147010 cycles
Oct  8 12:45:38 np0005477492 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Oct  8 12:45:38 np0005477492 kernel: tsc: Detected 2800.000 MHz processor
Oct  8 12:45:38 np0005477492 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Oct  8 12:45:38 np0005477492 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Oct  8 12:45:38 np0005477492 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Oct  8 12:45:38 np0005477492 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Oct  8 12:45:38 np0005477492 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Oct  8 12:45:38 np0005477492 kernel: Using GB pages for direct mapping
Oct  8 12:45:38 np0005477492 kernel: RAMDISK: [mem 0x2d7c4000-0x32bd9fff]
Oct  8 12:45:38 np0005477492 kernel: ACPI: Early table checksum verification disabled
Oct  8 12:45:38 np0005477492 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Oct  8 12:45:38 np0005477492 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  8 12:45:38 np0005477492 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  8 12:45:38 np0005477492 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  8 12:45:38 np0005477492 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Oct  8 12:45:38 np0005477492 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  8 12:45:38 np0005477492 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  8 12:45:38 np0005477492 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Oct  8 12:45:38 np0005477492 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Oct  8 12:45:38 np0005477492 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Oct  8 12:45:38 np0005477492 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Oct  8 12:45:38 np0005477492 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Oct  8 12:45:38 np0005477492 kernel: No NUMA configuration found
Oct  8 12:45:38 np0005477492 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Oct  8 12:45:38 np0005477492 kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Oct  8 12:45:38 np0005477492 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Oct  8 12:45:38 np0005477492 kernel: Zone ranges:
Oct  8 12:45:38 np0005477492 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Oct  8 12:45:38 np0005477492 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Oct  8 12:45:38 np0005477492 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Oct  8 12:45:38 np0005477492 kernel:  Device   empty
Oct  8 12:45:38 np0005477492 kernel: Movable zone start for each node
Oct  8 12:45:38 np0005477492 kernel: Early memory node ranges
Oct  8 12:45:38 np0005477492 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Oct  8 12:45:38 np0005477492 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Oct  8 12:45:38 np0005477492 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Oct  8 12:45:38 np0005477492 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Oct  8 12:45:38 np0005477492 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Oct  8 12:45:38 np0005477492 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Oct  8 12:45:38 np0005477492 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Oct  8 12:45:38 np0005477492 kernel: ACPI: PM-Timer IO Port: 0x608
Oct  8 12:45:38 np0005477492 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Oct  8 12:45:38 np0005477492 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Oct  8 12:45:38 np0005477492 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Oct  8 12:45:38 np0005477492 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Oct  8 12:45:38 np0005477492 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Oct  8 12:45:38 np0005477492 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Oct  8 12:45:38 np0005477492 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Oct  8 12:45:38 np0005477492 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Oct  8 12:45:38 np0005477492 kernel: TSC deadline timer available
Oct  8 12:45:38 np0005477492 kernel: CPU topo: Max. logical packages:   8
Oct  8 12:45:38 np0005477492 kernel: CPU topo: Max. logical dies:       8
Oct  8 12:45:38 np0005477492 kernel: CPU topo: Max. dies per package:   1
Oct  8 12:45:38 np0005477492 kernel: CPU topo: Max. threads per core:   1
Oct  8 12:45:38 np0005477492 kernel: CPU topo: Num. cores per package:     1
Oct  8 12:45:38 np0005477492 kernel: CPU topo: Num. threads per package:   1
Oct  8 12:45:38 np0005477492 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Oct  8 12:45:38 np0005477492 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Oct  8 12:45:38 np0005477492 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Oct  8 12:45:38 np0005477492 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Oct  8 12:45:38 np0005477492 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Oct  8 12:45:38 np0005477492 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Oct  8 12:45:38 np0005477492 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Oct  8 12:45:38 np0005477492 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Oct  8 12:45:38 np0005477492 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Oct  8 12:45:38 np0005477492 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Oct  8 12:45:38 np0005477492 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Oct  8 12:45:38 np0005477492 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Oct  8 12:45:38 np0005477492 kernel: Booting paravirtualized kernel on KVM
Oct  8 12:45:38 np0005477492 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Oct  8 12:45:38 np0005477492 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Oct  8 12:45:38 np0005477492 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Oct  8 12:45:38 np0005477492 kernel: kvm-guest: PV spinlocks disabled, no host support
Oct  8 12:45:38 np0005477492 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64 root=UUID=1631a6ad-43b8-436d-ae76-16fa14b94458 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct  8 12:45:38 np0005477492 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64", will be passed to user space.
Oct  8 12:45:38 np0005477492 kernel: random: crng init done
Oct  8 12:45:38 np0005477492 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Oct  8 12:45:38 np0005477492 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Oct  8 12:45:38 np0005477492 kernel: Fallback order for Node 0: 0 
Oct  8 12:45:38 np0005477492 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Oct  8 12:45:38 np0005477492 kernel: Policy zone: Normal
Oct  8 12:45:38 np0005477492 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Oct  8 12:45:38 np0005477492 kernel: software IO TLB: area num 8.
Oct  8 12:45:38 np0005477492 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Oct  8 12:45:38 np0005477492 kernel: ftrace: allocating 49370 entries in 193 pages
Oct  8 12:45:38 np0005477492 kernel: ftrace: allocated 193 pages with 3 groups
Oct  8 12:45:38 np0005477492 kernel: Dynamic Preempt: voluntary
Oct  8 12:45:38 np0005477492 kernel: rcu: Preemptible hierarchical RCU implementation.
Oct  8 12:45:38 np0005477492 kernel: rcu: 	RCU event tracing is enabled.
Oct  8 12:45:38 np0005477492 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Oct  8 12:45:38 np0005477492 kernel: 	Trampoline variant of Tasks RCU enabled.
Oct  8 12:45:38 np0005477492 kernel: 	Rude variant of Tasks RCU enabled.
Oct  8 12:45:38 np0005477492 kernel: 	Tracing variant of Tasks RCU enabled.
Oct  8 12:45:38 np0005477492 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Oct  8 12:45:38 np0005477492 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Oct  8 12:45:38 np0005477492 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct  8 12:45:38 np0005477492 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct  8 12:45:38 np0005477492 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct  8 12:45:38 np0005477492 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Oct  8 12:45:38 np0005477492 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Oct  8 12:45:38 np0005477492 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Oct  8 12:45:38 np0005477492 kernel: Console: colour VGA+ 80x25
Oct  8 12:45:38 np0005477492 kernel: printk: console [ttyS0] enabled
Oct  8 12:45:38 np0005477492 kernel: ACPI: Core revision 20230331
Oct  8 12:45:38 np0005477492 kernel: APIC: Switch to symmetric I/O mode setup
Oct  8 12:45:38 np0005477492 kernel: x2apic enabled
Oct  8 12:45:38 np0005477492 kernel: APIC: Switched APIC routing to: physical x2apic
Oct  8 12:45:38 np0005477492 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Oct  8 12:45:38 np0005477492 kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Oct  8 12:45:38 np0005477492 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Oct  8 12:45:38 np0005477492 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Oct  8 12:45:38 np0005477492 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Oct  8 12:45:38 np0005477492 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Oct  8 12:45:38 np0005477492 kernel: Spectre V2 : Mitigation: Retpolines
Oct  8 12:45:38 np0005477492 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Oct  8 12:45:38 np0005477492 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Oct  8 12:45:38 np0005477492 kernel: RETBleed: Mitigation: untrained return thunk
Oct  8 12:45:38 np0005477492 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Oct  8 12:45:38 np0005477492 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Oct  8 12:45:38 np0005477492 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Oct  8 12:45:38 np0005477492 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Oct  8 12:45:38 np0005477492 kernel: x86/bugs: return thunk changed
Oct  8 12:45:38 np0005477492 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Oct  8 12:45:38 np0005477492 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Oct  8 12:45:38 np0005477492 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Oct  8 12:45:38 np0005477492 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Oct  8 12:45:38 np0005477492 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Oct  8 12:45:38 np0005477492 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Oct  8 12:45:38 np0005477492 kernel: Freeing SMP alternatives memory: 40K
Oct  8 12:45:38 np0005477492 kernel: pid_max: default: 32768 minimum: 301
Oct  8 12:45:38 np0005477492 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Oct  8 12:45:38 np0005477492 kernel: landlock: Up and running.
Oct  8 12:45:38 np0005477492 kernel: Yama: becoming mindful.
Oct  8 12:45:38 np0005477492 kernel: SELinux:  Initializing.
Oct  8 12:45:38 np0005477492 kernel: LSM support for eBPF active
Oct  8 12:45:38 np0005477492 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Oct  8 12:45:38 np0005477492 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Oct  8 12:45:38 np0005477492 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Oct  8 12:45:38 np0005477492 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Oct  8 12:45:38 np0005477492 kernel: ... version:                0
Oct  8 12:45:38 np0005477492 kernel: ... bit width:              48
Oct  8 12:45:38 np0005477492 kernel: ... generic registers:      6
Oct  8 12:45:38 np0005477492 kernel: ... value mask:             0000ffffffffffff
Oct  8 12:45:38 np0005477492 kernel: ... max period:             00007fffffffffff
Oct  8 12:45:38 np0005477492 kernel: ... fixed-purpose events:   0
Oct  8 12:45:38 np0005477492 kernel: ... event mask:             000000000000003f
Oct  8 12:45:38 np0005477492 kernel: signal: max sigframe size: 1776
Oct  8 12:45:38 np0005477492 kernel: rcu: Hierarchical SRCU implementation.
Oct  8 12:45:38 np0005477492 kernel: rcu: 	Max phase no-delay instances is 400.
Oct  8 12:45:38 np0005477492 kernel: smp: Bringing up secondary CPUs ...
Oct  8 12:45:38 np0005477492 kernel: smpboot: x86: Booting SMP configuration:
Oct  8 12:45:38 np0005477492 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Oct  8 12:45:38 np0005477492 kernel: smp: Brought up 1 node, 8 CPUs
Oct  8 12:45:38 np0005477492 kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Oct  8 12:45:38 np0005477492 kernel: node 0 deferred pages initialised in 33ms
Oct  8 12:45:38 np0005477492 kernel: Memory: 7765688K/8388068K available (16384K kernel code, 5784K rwdata, 13996K rodata, 4068K init, 7304K bss, 616504K reserved, 0K cma-reserved)
Oct  8 12:45:38 np0005477492 kernel: devtmpfs: initialized
Oct  8 12:45:38 np0005477492 kernel: x86/mm: Memory block size: 128MB
Oct  8 12:45:38 np0005477492 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Oct  8 12:45:38 np0005477492 kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Oct  8 12:45:38 np0005477492 kernel: pinctrl core: initialized pinctrl subsystem
Oct  8 12:45:38 np0005477492 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Oct  8 12:45:38 np0005477492 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Oct  8 12:45:38 np0005477492 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Oct  8 12:45:38 np0005477492 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Oct  8 12:45:38 np0005477492 kernel: audit: initializing netlink subsys (disabled)
Oct  8 12:45:38 np0005477492 kernel: audit: type=2000 audit(1759941937.384:1): state=initialized audit_enabled=0 res=1
Oct  8 12:45:38 np0005477492 kernel: thermal_sys: Registered thermal governor 'fair_share'
Oct  8 12:45:38 np0005477492 kernel: thermal_sys: Registered thermal governor 'step_wise'
Oct  8 12:45:38 np0005477492 kernel: thermal_sys: Registered thermal governor 'user_space'
Oct  8 12:45:38 np0005477492 kernel: cpuidle: using governor menu
Oct  8 12:45:38 np0005477492 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Oct  8 12:45:38 np0005477492 kernel: PCI: Using configuration type 1 for base access
Oct  8 12:45:38 np0005477492 kernel: PCI: Using configuration type 1 for extended access
Oct  8 12:45:38 np0005477492 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Oct  8 12:45:38 np0005477492 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Oct  8 12:45:38 np0005477492 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Oct  8 12:45:38 np0005477492 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Oct  8 12:45:38 np0005477492 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Oct  8 12:45:38 np0005477492 kernel: Demotion targets for Node 0: null
Oct  8 12:45:38 np0005477492 kernel: cryptd: max_cpu_qlen set to 1000
Oct  8 12:45:38 np0005477492 kernel: ACPI: Added _OSI(Module Device)
Oct  8 12:45:38 np0005477492 kernel: ACPI: Added _OSI(Processor Device)
Oct  8 12:45:38 np0005477492 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Oct  8 12:45:38 np0005477492 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Oct  8 12:45:38 np0005477492 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Oct  8 12:45:38 np0005477492 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Oct  8 12:45:38 np0005477492 kernel: ACPI: Interpreter enabled
Oct  8 12:45:38 np0005477492 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Oct  8 12:45:38 np0005477492 kernel: ACPI: Using IOAPIC for interrupt routing
Oct  8 12:45:38 np0005477492 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Oct  8 12:45:38 np0005477492 kernel: PCI: Using E820 reservations for host bridge windows
Oct  8 12:45:38 np0005477492 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Oct  8 12:45:38 np0005477492 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Oct  8 12:45:38 np0005477492 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Oct  8 12:45:38 np0005477492 kernel: acpiphp: Slot [3] registered
Oct  8 12:45:38 np0005477492 kernel: acpiphp: Slot [4] registered
Oct  8 12:45:38 np0005477492 kernel: acpiphp: Slot [5] registered
Oct  8 12:45:38 np0005477492 kernel: acpiphp: Slot [6] registered
Oct  8 12:45:38 np0005477492 kernel: acpiphp: Slot [7] registered
Oct  8 12:45:38 np0005477492 kernel: acpiphp: Slot [8] registered
Oct  8 12:45:38 np0005477492 kernel: acpiphp: Slot [9] registered
Oct  8 12:45:38 np0005477492 kernel: acpiphp: Slot [10] registered
Oct  8 12:45:38 np0005477492 kernel: acpiphp: Slot [11] registered
Oct  8 12:45:38 np0005477492 kernel: acpiphp: Slot [12] registered
Oct  8 12:45:38 np0005477492 kernel: acpiphp: Slot [13] registered
Oct  8 12:45:38 np0005477492 kernel: acpiphp: Slot [14] registered
Oct  8 12:45:38 np0005477492 kernel: acpiphp: Slot [15] registered
Oct  8 12:45:38 np0005477492 kernel: acpiphp: Slot [16] registered
Oct  8 12:45:38 np0005477492 kernel: acpiphp: Slot [17] registered
Oct  8 12:45:38 np0005477492 kernel: acpiphp: Slot [18] registered
Oct  8 12:45:38 np0005477492 kernel: acpiphp: Slot [19] registered
Oct  8 12:45:38 np0005477492 kernel: acpiphp: Slot [20] registered
Oct  8 12:45:38 np0005477492 kernel: acpiphp: Slot [21] registered
Oct  8 12:45:38 np0005477492 kernel: acpiphp: Slot [22] registered
Oct  8 12:45:38 np0005477492 kernel: acpiphp: Slot [23] registered
Oct  8 12:45:38 np0005477492 kernel: acpiphp: Slot [24] registered
Oct  8 12:45:38 np0005477492 kernel: acpiphp: Slot [25] registered
Oct  8 12:45:38 np0005477492 kernel: acpiphp: Slot [26] registered
Oct  8 12:45:38 np0005477492 kernel: acpiphp: Slot [27] registered
Oct  8 12:45:38 np0005477492 kernel: acpiphp: Slot [28] registered
Oct  8 12:45:38 np0005477492 kernel: acpiphp: Slot [29] registered
Oct  8 12:45:38 np0005477492 kernel: acpiphp: Slot [30] registered
Oct  8 12:45:38 np0005477492 kernel: acpiphp: Slot [31] registered
Oct  8 12:45:38 np0005477492 kernel: PCI host bridge to bus 0000:00
Oct  8 12:45:38 np0005477492 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Oct  8 12:45:38 np0005477492 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Oct  8 12:45:38 np0005477492 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Oct  8 12:45:38 np0005477492 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Oct  8 12:45:38 np0005477492 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Oct  8 12:45:38 np0005477492 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Oct  8 12:45:38 np0005477492 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Oct  8 12:45:38 np0005477492 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Oct  8 12:45:38 np0005477492 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Oct  8 12:45:38 np0005477492 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Oct  8 12:45:38 np0005477492 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Oct  8 12:45:38 np0005477492 kernel: iommu: Default domain type: Translated
Oct  8 12:45:38 np0005477492 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Oct  8 12:45:38 np0005477492 kernel: SCSI subsystem initialized
Oct  8 12:45:38 np0005477492 kernel: ACPI: bus type USB registered
Oct  8 12:45:38 np0005477492 kernel: usbcore: registered new interface driver usbfs
Oct  8 12:45:38 np0005477492 kernel: usbcore: registered new interface driver hub
Oct  8 12:45:38 np0005477492 kernel: usbcore: registered new device driver usb
Oct  8 12:45:38 np0005477492 kernel: pps_core: LinuxPPS API ver. 1 registered
Oct  8 12:45:38 np0005477492 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Oct  8 12:45:38 np0005477492 kernel: PTP clock support registered
Oct  8 12:45:38 np0005477492 kernel: EDAC MC: Ver: 3.0.0
Oct  8 12:45:38 np0005477492 kernel: NetLabel: Initializing
Oct  8 12:45:38 np0005477492 kernel: NetLabel:  domain hash size = 128
Oct  8 12:45:38 np0005477492 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Oct  8 12:45:38 np0005477492 kernel: NetLabel:  unlabeled traffic allowed by default
Oct  8 12:45:38 np0005477492 kernel: PCI: Using ACPI for IRQ routing
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Oct  8 12:45:38 np0005477492 kernel: vgaarb: loaded
Oct  8 12:45:38 np0005477492 kernel: clocksource: Switched to clocksource kvm-clock
Oct  8 12:45:38 np0005477492 kernel: VFS: Disk quotas dquot_6.6.0
Oct  8 12:45:38 np0005477492 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Oct  8 12:45:38 np0005477492 kernel: pnp: PnP ACPI init
Oct  8 12:45:38 np0005477492 kernel: pnp: PnP ACPI: found 5 devices
Oct  8 12:45:38 np0005477492 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Oct  8 12:45:38 np0005477492 kernel: NET: Registered PF_INET protocol family
Oct  8 12:45:38 np0005477492 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Oct  8 12:45:38 np0005477492 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Oct  8 12:45:38 np0005477492 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Oct  8 12:45:38 np0005477492 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Oct  8 12:45:38 np0005477492 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Oct  8 12:45:38 np0005477492 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Oct  8 12:45:38 np0005477492 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Oct  8 12:45:38 np0005477492 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Oct  8 12:45:38 np0005477492 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Oct  8 12:45:38 np0005477492 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Oct  8 12:45:38 np0005477492 kernel: NET: Registered PF_XDP protocol family
Oct  8 12:45:38 np0005477492 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Oct  8 12:45:38 np0005477492 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Oct  8 12:45:38 np0005477492 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Oct  8 12:45:38 np0005477492 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Oct  8 12:45:38 np0005477492 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Oct  8 12:45:38 np0005477492 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Oct  8 12:45:38 np0005477492 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x140 took 74115 usecs
Oct  8 12:45:38 np0005477492 kernel: PCI: CLS 0 bytes, default 64
Oct  8 12:45:38 np0005477492 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Oct  8 12:45:38 np0005477492 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Oct  8 12:45:38 np0005477492 kernel: Trying to unpack rootfs image as initramfs...
Oct  8 12:45:38 np0005477492 kernel: ACPI: bus type thunderbolt registered
Oct  8 12:45:38 np0005477492 kernel: Initialise system trusted keyrings
Oct  8 12:45:38 np0005477492 kernel: Key type blacklist registered
Oct  8 12:45:38 np0005477492 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Oct  8 12:45:38 np0005477492 kernel: zbud: loaded
Oct  8 12:45:38 np0005477492 kernel: integrity: Platform Keyring initialized
Oct  8 12:45:38 np0005477492 kernel: integrity: Machine keyring initialized
Oct  8 12:45:38 np0005477492 kernel: Freeing initrd memory: 86104K
Oct  8 12:45:38 np0005477492 kernel: NET: Registered PF_ALG protocol family
Oct  8 12:45:38 np0005477492 kernel: xor: automatically using best checksumming function   avx       
Oct  8 12:45:38 np0005477492 kernel: Key type asymmetric registered
Oct  8 12:45:38 np0005477492 kernel: Asymmetric key parser 'x509' registered
Oct  8 12:45:38 np0005477492 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Oct  8 12:45:38 np0005477492 kernel: io scheduler mq-deadline registered
Oct  8 12:45:38 np0005477492 kernel: io scheduler kyber registered
Oct  8 12:45:38 np0005477492 kernel: io scheduler bfq registered
Oct  8 12:45:38 np0005477492 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Oct  8 12:45:38 np0005477492 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Oct  8 12:45:38 np0005477492 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Oct  8 12:45:38 np0005477492 kernel: ACPI: button: Power Button [PWRF]
Oct  8 12:45:38 np0005477492 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Oct  8 12:45:38 np0005477492 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Oct  8 12:45:38 np0005477492 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Oct  8 12:45:38 np0005477492 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Oct  8 12:45:38 np0005477492 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Oct  8 12:45:38 np0005477492 kernel: Non-volatile memory driver v1.3
Oct  8 12:45:38 np0005477492 kernel: rdac: device handler registered
Oct  8 12:45:38 np0005477492 kernel: hp_sw: device handler registered
Oct  8 12:45:38 np0005477492 kernel: emc: device handler registered
Oct  8 12:45:38 np0005477492 kernel: alua: device handler registered
Oct  8 12:45:38 np0005477492 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Oct  8 12:45:38 np0005477492 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Oct  8 12:45:38 np0005477492 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Oct  8 12:45:38 np0005477492 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Oct  8 12:45:38 np0005477492 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Oct  8 12:45:38 np0005477492 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Oct  8 12:45:38 np0005477492 kernel: usb usb1: Product: UHCI Host Controller
Oct  8 12:45:38 np0005477492 kernel: usb usb1: Manufacturer: Linux 5.14.0-620.el9.x86_64 uhci_hcd
Oct  8 12:45:38 np0005477492 kernel: usb usb1: SerialNumber: 0000:00:01.2
Oct  8 12:45:38 np0005477492 kernel: hub 1-0:1.0: USB hub found
Oct  8 12:45:38 np0005477492 kernel: hub 1-0:1.0: 2 ports detected
Oct  8 12:45:38 np0005477492 kernel: usbcore: registered new interface driver usbserial_generic
Oct  8 12:45:38 np0005477492 kernel: usbserial: USB Serial support registered for generic
Oct  8 12:45:38 np0005477492 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Oct  8 12:45:38 np0005477492 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Oct  8 12:45:38 np0005477492 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Oct  8 12:45:38 np0005477492 kernel: mousedev: PS/2 mouse device common for all mice
Oct  8 12:45:38 np0005477492 kernel: rtc_cmos 00:04: RTC can wake from S4
Oct  8 12:45:38 np0005477492 kernel: rtc_cmos 00:04: registered as rtc0
Oct  8 12:45:38 np0005477492 kernel: rtc_cmos 00:04: setting system clock to 2025-10-08T16:45:37 UTC (1759941937)
Oct  8 12:45:38 np0005477492 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Oct  8 12:45:38 np0005477492 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Oct  8 12:45:38 np0005477492 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Oct  8 12:45:38 np0005477492 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Oct  8 12:45:38 np0005477492 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Oct  8 12:45:38 np0005477492 kernel: hid: raw HID events driver (C) Jiri Kosina
Oct  8 12:45:38 np0005477492 kernel: usbcore: registered new interface driver usbhid
Oct  8 12:45:38 np0005477492 kernel: usbhid: USB HID core driver
Oct  8 12:45:38 np0005477492 kernel: drop_monitor: Initializing network drop monitor service
Oct  8 12:45:38 np0005477492 kernel: Initializing XFRM netlink socket
Oct  8 12:45:38 np0005477492 kernel: NET: Registered PF_INET6 protocol family
Oct  8 12:45:38 np0005477492 kernel: Segment Routing with IPv6
Oct  8 12:45:38 np0005477492 kernel: NET: Registered PF_PACKET protocol family
Oct  8 12:45:38 np0005477492 kernel: mpls_gso: MPLS GSO support
Oct  8 12:45:38 np0005477492 kernel: IPI shorthand broadcast: enabled
Oct  8 12:45:38 np0005477492 kernel: AVX2 version of gcm_enc/dec engaged.
Oct  8 12:45:38 np0005477492 kernel: AES CTR mode by8 optimization enabled
Oct  8 12:45:38 np0005477492 kernel: sched_clock: Marking stable (1231007260, 145106650)->(1490467890, -114353980)
Oct  8 12:45:38 np0005477492 kernel: registered taskstats version 1
Oct  8 12:45:38 np0005477492 kernel: Loading compiled-in X.509 certificates
Oct  8 12:45:38 np0005477492 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4ff821c4997fbb659836adb05f5bc400c914e148'
Oct  8 12:45:38 np0005477492 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Oct  8 12:45:38 np0005477492 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Oct  8 12:45:38 np0005477492 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Oct  8 12:45:38 np0005477492 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Oct  8 12:45:38 np0005477492 kernel: Demotion targets for Node 0: null
Oct  8 12:45:38 np0005477492 kernel: page_owner is disabled
Oct  8 12:45:38 np0005477492 kernel: Key type .fscrypt registered
Oct  8 12:45:38 np0005477492 kernel: Key type fscrypt-provisioning registered
Oct  8 12:45:38 np0005477492 kernel: Key type big_key registered
Oct  8 12:45:38 np0005477492 kernel: Key type encrypted registered
Oct  8 12:45:38 np0005477492 kernel: ima: No TPM chip found, activating TPM-bypass!
Oct  8 12:45:38 np0005477492 kernel: Loading compiled-in module X.509 certificates
Oct  8 12:45:38 np0005477492 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4ff821c4997fbb659836adb05f5bc400c914e148'
Oct  8 12:45:38 np0005477492 kernel: ima: Allocated hash algorithm: sha256
Oct  8 12:45:38 np0005477492 kernel: ima: No architecture policies found
Oct  8 12:45:38 np0005477492 kernel: evm: Initialising EVM extended attributes:
Oct  8 12:45:38 np0005477492 kernel: evm: security.selinux
Oct  8 12:45:38 np0005477492 kernel: evm: security.SMACK64 (disabled)
Oct  8 12:45:38 np0005477492 kernel: evm: security.SMACK64EXEC (disabled)
Oct  8 12:45:38 np0005477492 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Oct  8 12:45:38 np0005477492 kernel: evm: security.SMACK64MMAP (disabled)
Oct  8 12:45:38 np0005477492 kernel: evm: security.apparmor (disabled)
Oct  8 12:45:38 np0005477492 kernel: evm: security.ima
Oct  8 12:45:38 np0005477492 kernel: evm: security.capability
Oct  8 12:45:38 np0005477492 kernel: evm: HMAC attrs: 0x1
Oct  8 12:45:38 np0005477492 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Oct  8 12:45:38 np0005477492 kernel: Running certificate verification RSA selftest
Oct  8 12:45:38 np0005477492 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Oct  8 12:45:38 np0005477492 kernel: Running certificate verification ECDSA selftest
Oct  8 12:45:38 np0005477492 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Oct  8 12:45:38 np0005477492 kernel: clk: Disabling unused clocks
Oct  8 12:45:38 np0005477492 kernel: Freeing unused decrypted memory: 2028K
Oct  8 12:45:38 np0005477492 kernel: Freeing unused kernel image (initmem) memory: 4068K
Oct  8 12:45:38 np0005477492 kernel: Write protecting the kernel read-only data: 30720k
Oct  8 12:45:38 np0005477492 kernel: Freeing unused kernel image (rodata/data gap) memory: 340K
Oct  8 12:45:38 np0005477492 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Oct  8 12:45:38 np0005477492 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Oct  8 12:45:38 np0005477492 kernel: usb 1-1: Product: QEMU USB Tablet
Oct  8 12:45:38 np0005477492 kernel: usb 1-1: Manufacturer: QEMU
Oct  8 12:45:38 np0005477492 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Oct  8 12:45:38 np0005477492 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Oct  8 12:45:38 np0005477492 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Oct  8 12:45:38 np0005477492 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Oct  8 12:45:38 np0005477492 kernel: Run /init as init process
Oct  8 12:45:38 np0005477492 systemd: systemd 252-55.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Oct  8 12:45:38 np0005477492 systemd: Detected virtualization kvm.
Oct  8 12:45:38 np0005477492 systemd: Detected architecture x86-64.
Oct  8 12:45:38 np0005477492 systemd: Running in initrd.
Oct  8 12:45:38 np0005477492 systemd: No hostname configured, using default hostname.
Oct  8 12:45:38 np0005477492 systemd: Hostname set to <localhost>.
Oct  8 12:45:38 np0005477492 systemd: Initializing machine ID from VM UUID.
Oct  8 12:45:38 np0005477492 systemd: Queued start job for default target Initrd Default Target.
Oct  8 12:45:38 np0005477492 systemd: Started Dispatch Password Requests to Console Directory Watch.
Oct  8 12:45:38 np0005477492 systemd: Reached target Local Encrypted Volumes.
Oct  8 12:45:38 np0005477492 systemd: Reached target Initrd /usr File System.
Oct  8 12:45:38 np0005477492 systemd: Reached target Local File Systems.
Oct  8 12:45:38 np0005477492 systemd: Reached target Path Units.
Oct  8 12:45:38 np0005477492 systemd: Reached target Slice Units.
Oct  8 12:45:38 np0005477492 systemd: Reached target Swaps.
Oct  8 12:45:38 np0005477492 systemd: Reached target Timer Units.
Oct  8 12:45:38 np0005477492 systemd: Listening on D-Bus System Message Bus Socket.
Oct  8 12:45:38 np0005477492 systemd: Listening on Journal Socket (/dev/log).
Oct  8 12:45:38 np0005477492 systemd: Listening on Journal Socket.
Oct  8 12:45:38 np0005477492 systemd: Listening on udev Control Socket.
Oct  8 12:45:38 np0005477492 systemd: Listening on udev Kernel Socket.
Oct  8 12:45:38 np0005477492 systemd: Reached target Socket Units.
Oct  8 12:45:38 np0005477492 systemd: Starting Create List of Static Device Nodes...
Oct  8 12:45:38 np0005477492 systemd: Starting Journal Service...
Oct  8 12:45:38 np0005477492 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Oct  8 12:45:38 np0005477492 systemd: Starting Apply Kernel Variables...
Oct  8 12:45:38 np0005477492 systemd: Starting Create System Users...
Oct  8 12:45:38 np0005477492 systemd: Starting Setup Virtual Console...
Oct  8 12:45:38 np0005477492 systemd: Finished Create List of Static Device Nodes.
Oct  8 12:45:38 np0005477492 systemd: Finished Apply Kernel Variables.
Oct  8 12:45:38 np0005477492 systemd-journald[307]: Journal started
Oct  8 12:45:38 np0005477492 systemd-journald[307]: Runtime Journal (/run/log/journal/9ff32318d7e04b37bb6eea4cfd795672) is 8.0M, max 153.5M, 145.5M free.
Oct  8 12:45:38 np0005477492 systemd: Started Journal Service.
Oct  8 12:45:38 np0005477492 systemd-sysusers[312]: Creating group 'users' with GID 100.
Oct  8 12:45:38 np0005477492 systemd-sysusers[312]: Creating group 'dbus' with GID 81.
Oct  8 12:45:38 np0005477492 systemd-sysusers[312]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Oct  8 12:45:38 np0005477492 systemd[1]: Finished Create System Users.
Oct  8 12:45:38 np0005477492 systemd[1]: Starting Create Static Device Nodes in /dev...
Oct  8 12:45:38 np0005477492 systemd[1]: Starting Create Volatile Files and Directories...
Oct  8 12:45:38 np0005477492 systemd[1]: Finished Create Static Device Nodes in /dev.
Oct  8 12:45:38 np0005477492 systemd[1]: Finished Create Volatile Files and Directories.
Oct  8 12:45:38 np0005477492 systemd[1]: Finished Setup Virtual Console.
Oct  8 12:45:38 np0005477492 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Oct  8 12:45:38 np0005477492 systemd[1]: Starting dracut cmdline hook...
Oct  8 12:45:38 np0005477492 dracut-cmdline[327]: dracut-9 dracut-057-102.git20250818.el9
Oct  8 12:45:38 np0005477492 dracut-cmdline[327]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64 root=UUID=1631a6ad-43b8-436d-ae76-16fa14b94458 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct  8 12:45:38 np0005477492 systemd[1]: Finished dracut cmdline hook.
Oct  8 12:45:38 np0005477492 systemd[1]: Starting dracut pre-udev hook...
Oct  8 12:45:38 np0005477492 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Oct  8 12:45:38 np0005477492 kernel: device-mapper: uevent: version 1.0.3
Oct  8 12:45:38 np0005477492 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Oct  8 12:45:38 np0005477492 kernel: RPC: Registered named UNIX socket transport module.
Oct  8 12:45:38 np0005477492 kernel: RPC: Registered udp transport module.
Oct  8 12:45:38 np0005477492 kernel: RPC: Registered tcp transport module.
Oct  8 12:45:38 np0005477492 kernel: RPC: Registered tcp-with-tls transport module.
Oct  8 12:45:38 np0005477492 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Oct  8 12:45:38 np0005477492 rpc.statd[444]: Version 2.5.4 starting
Oct  8 12:45:38 np0005477492 rpc.statd[444]: Initializing NSM state
Oct  8 12:45:38 np0005477492 rpc.idmapd[449]: Setting log level to 0
Oct  8 12:45:38 np0005477492 systemd[1]: Finished dracut pre-udev hook.
Oct  8 12:45:38 np0005477492 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Oct  8 12:45:38 np0005477492 systemd-udevd[462]: Using default interface naming scheme 'rhel-9.0'.
Oct  8 12:45:38 np0005477492 systemd[1]: Started Rule-based Manager for Device Events and Files.
Oct  8 12:45:38 np0005477492 systemd[1]: Starting dracut pre-trigger hook...
Oct  8 12:45:38 np0005477492 systemd[1]: Finished dracut pre-trigger hook.
Oct  8 12:45:38 np0005477492 systemd[1]: Starting Coldplug All udev Devices...
Oct  8 12:45:39 np0005477492 systemd[1]: Created slice Slice /system/modprobe.
Oct  8 12:45:39 np0005477492 systemd[1]: Starting Load Kernel Module configfs...
Oct  8 12:45:39 np0005477492 systemd[1]: Finished Coldplug All udev Devices.
Oct  8 12:45:39 np0005477492 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct  8 12:45:39 np0005477492 systemd[1]: Finished Load Kernel Module configfs.
Oct  8 12:45:39 np0005477492 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Oct  8 12:45:39 np0005477492 systemd[1]: Reached target Network.
Oct  8 12:45:39 np0005477492 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Oct  8 12:45:39 np0005477492 systemd[1]: Starting dracut initqueue hook...
Oct  8 12:45:39 np0005477492 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Oct  8 12:45:39 np0005477492 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Oct  8 12:45:39 np0005477492 kernel: vda: vda1
Oct  8 12:45:39 np0005477492 kernel: scsi host0: ata_piix
Oct  8 12:45:39 np0005477492 kernel: scsi host1: ata_piix
Oct  8 12:45:39 np0005477492 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Oct  8 12:45:39 np0005477492 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Oct  8 12:45:39 np0005477492 systemd[1]: Found device /dev/disk/by-uuid/1631a6ad-43b8-436d-ae76-16fa14b94458.
Oct  8 12:45:39 np0005477492 systemd[1]: Reached target Initrd Root Device.
Oct  8 12:45:39 np0005477492 systemd[1]: Mounting Kernel Configuration File System...
Oct  8 12:45:39 np0005477492 systemd[1]: Mounted Kernel Configuration File System.
Oct  8 12:45:39 np0005477492 systemd[1]: Reached target System Initialization.
Oct  8 12:45:39 np0005477492 systemd[1]: Reached target Basic System.
Oct  8 12:45:39 np0005477492 kernel: ata1: found unknown device (class 0)
Oct  8 12:45:39 np0005477492 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Oct  8 12:45:39 np0005477492 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Oct  8 12:45:39 np0005477492 systemd-udevd[508]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 12:45:39 np0005477492 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Oct  8 12:45:39 np0005477492 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Oct  8 12:45:39 np0005477492 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Oct  8 12:45:39 np0005477492 systemd[1]: Finished dracut initqueue hook.
Oct  8 12:45:39 np0005477492 systemd[1]: Reached target Preparation for Remote File Systems.
Oct  8 12:45:39 np0005477492 systemd[1]: Reached target Remote Encrypted Volumes.
Oct  8 12:45:39 np0005477492 systemd[1]: Reached target Remote File Systems.
Oct  8 12:45:39 np0005477492 systemd[1]: Starting dracut pre-mount hook...
Oct  8 12:45:39 np0005477492 systemd[1]: Finished dracut pre-mount hook.
Oct  8 12:45:39 np0005477492 systemd[1]: Starting File System Check on /dev/disk/by-uuid/1631a6ad-43b8-436d-ae76-16fa14b94458...
Oct  8 12:45:39 np0005477492 systemd-fsck[555]: /usr/sbin/fsck.xfs: XFS file system.
Oct  8 12:45:39 np0005477492 systemd[1]: Finished File System Check on /dev/disk/by-uuid/1631a6ad-43b8-436d-ae76-16fa14b94458.
Oct  8 12:45:39 np0005477492 systemd[1]: Mounting /sysroot...
Oct  8 12:45:40 np0005477492 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Oct  8 12:45:40 np0005477492 kernel: XFS (vda1): Mounting V5 Filesystem 1631a6ad-43b8-436d-ae76-16fa14b94458
Oct  8 12:45:40 np0005477492 kernel: XFS (vda1): Ending clean mount
Oct  8 12:45:40 np0005477492 systemd[1]: Mounted /sysroot.
Oct  8 12:45:40 np0005477492 systemd[1]: Reached target Initrd Root File System.
Oct  8 12:45:40 np0005477492 systemd[1]: Starting Mountpoints Configured in the Real Root...
Oct  8 12:45:40 np0005477492 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Oct  8 12:45:40 np0005477492 systemd[1]: Finished Mountpoints Configured in the Real Root.
Oct  8 12:45:40 np0005477492 systemd[1]: Reached target Initrd File Systems.
Oct  8 12:45:40 np0005477492 systemd[1]: Reached target Initrd Default Target.
Oct  8 12:45:40 np0005477492 systemd[1]: Starting dracut mount hook...
Oct  8 12:45:40 np0005477492 systemd[1]: Finished dracut mount hook.
Oct  8 12:45:40 np0005477492 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Oct  8 12:45:40 np0005477492 rpc.idmapd[449]: exiting on signal 15
Oct  8 12:45:40 np0005477492 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Oct  8 12:45:40 np0005477492 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Oct  8 12:45:40 np0005477492 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Oct  8 12:45:40 np0005477492 systemd[1]: Stopped target Network.
Oct  8 12:45:40 np0005477492 systemd[1]: Stopped target Remote Encrypted Volumes.
Oct  8 12:45:40 np0005477492 systemd[1]: Stopped target Timer Units.
Oct  8 12:45:40 np0005477492 systemd[1]: dbus.socket: Deactivated successfully.
Oct  8 12:45:40 np0005477492 systemd[1]: Closed D-Bus System Message Bus Socket.
Oct  8 12:45:40 np0005477492 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Oct  8 12:45:40 np0005477492 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Oct  8 12:45:40 np0005477492 systemd[1]: Stopped target Initrd Default Target.
Oct  8 12:45:40 np0005477492 systemd[1]: Stopped target Basic System.
Oct  8 12:45:40 np0005477492 systemd[1]: Stopped target Initrd Root Device.
Oct  8 12:45:40 np0005477492 systemd[1]: Stopped target Initrd /usr File System.
Oct  8 12:45:40 np0005477492 systemd[1]: Stopped target Path Units.
Oct  8 12:45:40 np0005477492 systemd[1]: Stopped target Remote File Systems.
Oct  8 12:45:40 np0005477492 systemd[1]: Stopped target Preparation for Remote File Systems.
Oct  8 12:45:40 np0005477492 systemd[1]: Stopped target Slice Units.
Oct  8 12:45:40 np0005477492 systemd[1]: Stopped target Socket Units.
Oct  8 12:45:40 np0005477492 systemd[1]: Stopped target System Initialization.
Oct  8 12:45:40 np0005477492 systemd[1]: Stopped target Local File Systems.
Oct  8 12:45:40 np0005477492 systemd[1]: Stopped target Swaps.
Oct  8 12:45:40 np0005477492 systemd[1]: dracut-mount.service: Deactivated successfully.
Oct  8 12:45:40 np0005477492 systemd[1]: Stopped dracut mount hook.
Oct  8 12:45:40 np0005477492 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Oct  8 12:45:40 np0005477492 systemd[1]: Stopped dracut pre-mount hook.
Oct  8 12:45:40 np0005477492 systemd[1]: Stopped target Local Encrypted Volumes.
Oct  8 12:45:40 np0005477492 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Oct  8 12:45:40 np0005477492 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Oct  8 12:45:40 np0005477492 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Oct  8 12:45:40 np0005477492 systemd[1]: Stopped dracut initqueue hook.
Oct  8 12:45:40 np0005477492 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Oct  8 12:45:40 np0005477492 systemd[1]: Stopped Apply Kernel Variables.
Oct  8 12:45:40 np0005477492 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Oct  8 12:45:40 np0005477492 systemd[1]: Stopped Create Volatile Files and Directories.
Oct  8 12:45:40 np0005477492 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Oct  8 12:45:40 np0005477492 systemd[1]: Stopped Coldplug All udev Devices.
Oct  8 12:45:40 np0005477492 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Oct  8 12:45:40 np0005477492 systemd[1]: Stopped dracut pre-trigger hook.
Oct  8 12:45:40 np0005477492 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Oct  8 12:45:40 np0005477492 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Oct  8 12:45:40 np0005477492 systemd[1]: Stopped Setup Virtual Console.
Oct  8 12:45:40 np0005477492 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Oct  8 12:45:40 np0005477492 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Oct  8 12:45:40 np0005477492 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Oct  8 12:45:40 np0005477492 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Oct  8 12:45:40 np0005477492 systemd[1]: systemd-udevd.service: Deactivated successfully.
Oct  8 12:45:40 np0005477492 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Oct  8 12:45:40 np0005477492 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Oct  8 12:45:40 np0005477492 systemd[1]: Closed udev Control Socket.
Oct  8 12:45:40 np0005477492 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Oct  8 12:45:40 np0005477492 systemd[1]: Closed udev Kernel Socket.
Oct  8 12:45:40 np0005477492 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Oct  8 12:45:40 np0005477492 systemd[1]: Stopped dracut pre-udev hook.
Oct  8 12:45:40 np0005477492 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Oct  8 12:45:40 np0005477492 systemd[1]: Stopped dracut cmdline hook.
Oct  8 12:45:40 np0005477492 systemd[1]: Starting Cleanup udev Database...
Oct  8 12:45:40 np0005477492 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Oct  8 12:45:40 np0005477492 systemd[1]: Stopped Create Static Device Nodes in /dev.
Oct  8 12:45:40 np0005477492 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Oct  8 12:45:40 np0005477492 systemd[1]: Stopped Create List of Static Device Nodes.
Oct  8 12:45:40 np0005477492 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Oct  8 12:45:40 np0005477492 systemd[1]: Stopped Create System Users.
Oct  8 12:45:40 np0005477492 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Oct  8 12:45:40 np0005477492 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Oct  8 12:45:40 np0005477492 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Oct  8 12:45:40 np0005477492 systemd[1]: Finished Cleanup udev Database.
Oct  8 12:45:40 np0005477492 systemd[1]: Reached target Switch Root.
Oct  8 12:45:40 np0005477492 systemd[1]: Starting Switch Root...
Oct  8 12:45:40 np0005477492 systemd[1]: Switching root.
Oct  8 12:45:40 np0005477492 systemd-journald[307]: Journal stopped
Oct  8 12:45:41 np0005477492 systemd-journald: Received SIGTERM from PID 1 (systemd).
Oct  8 12:45:41 np0005477492 kernel: audit: type=1404 audit(1759941940.548:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Oct  8 12:45:41 np0005477492 kernel: SELinux:  policy capability network_peer_controls=1
Oct  8 12:45:41 np0005477492 kernel: SELinux:  policy capability open_perms=1
Oct  8 12:45:41 np0005477492 kernel: SELinux:  policy capability extended_socket_class=1
Oct  8 12:45:41 np0005477492 kernel: SELinux:  policy capability always_check_network=0
Oct  8 12:45:41 np0005477492 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  8 12:45:41 np0005477492 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  8 12:45:41 np0005477492 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  8 12:45:41 np0005477492 kernel: audit: type=1403 audit(1759941940.700:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Oct  8 12:45:41 np0005477492 systemd: Successfully loaded SELinux policy in 157.471ms.
Oct  8 12:45:41 np0005477492 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 30.624ms.
Oct  8 12:45:41 np0005477492 systemd: systemd 252-55.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Oct  8 12:45:41 np0005477492 systemd: Detected virtualization kvm.
Oct  8 12:45:41 np0005477492 systemd: Detected architecture x86-64.
Oct  8 12:45:41 np0005477492 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 12:45:41 np0005477492 systemd: initrd-switch-root.service: Deactivated successfully.
Oct  8 12:45:41 np0005477492 systemd: Stopped Switch Root.
Oct  8 12:45:41 np0005477492 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Oct  8 12:45:41 np0005477492 systemd: Created slice Slice /system/getty.
Oct  8 12:45:41 np0005477492 systemd: Created slice Slice /system/serial-getty.
Oct  8 12:45:41 np0005477492 systemd: Created slice Slice /system/sshd-keygen.
Oct  8 12:45:41 np0005477492 systemd: Created slice User and Session Slice.
Oct  8 12:45:41 np0005477492 systemd: Started Dispatch Password Requests to Console Directory Watch.
Oct  8 12:45:41 np0005477492 systemd: Started Forward Password Requests to Wall Directory Watch.
Oct  8 12:45:41 np0005477492 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Oct  8 12:45:41 np0005477492 systemd: Reached target Local Encrypted Volumes.
Oct  8 12:45:41 np0005477492 systemd: Stopped target Switch Root.
Oct  8 12:45:41 np0005477492 systemd: Stopped target Initrd File Systems.
Oct  8 12:45:41 np0005477492 systemd: Stopped target Initrd Root File System.
Oct  8 12:45:41 np0005477492 systemd: Reached target Local Integrity Protected Volumes.
Oct  8 12:45:41 np0005477492 systemd: Reached target Path Units.
Oct  8 12:45:41 np0005477492 systemd: Reached target rpc_pipefs.target.
Oct  8 12:45:41 np0005477492 systemd: Reached target Slice Units.
Oct  8 12:45:41 np0005477492 systemd: Reached target Swaps.
Oct  8 12:45:41 np0005477492 systemd: Reached target Local Verity Protected Volumes.
Oct  8 12:45:41 np0005477492 systemd: Listening on RPCbind Server Activation Socket.
Oct  8 12:45:41 np0005477492 systemd: Reached target RPC Port Mapper.
Oct  8 12:45:41 np0005477492 systemd: Listening on Process Core Dump Socket.
Oct  8 12:45:41 np0005477492 systemd: Listening on initctl Compatibility Named Pipe.
Oct  8 12:45:41 np0005477492 systemd: Listening on udev Control Socket.
Oct  8 12:45:41 np0005477492 systemd: Listening on udev Kernel Socket.
Oct  8 12:45:41 np0005477492 systemd: Mounting Huge Pages File System...
Oct  8 12:45:41 np0005477492 systemd: Mounting POSIX Message Queue File System...
Oct  8 12:45:41 np0005477492 systemd: Mounting Kernel Debug File System...
Oct  8 12:45:41 np0005477492 systemd: Mounting Kernel Trace File System...
Oct  8 12:45:41 np0005477492 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Oct  8 12:45:41 np0005477492 systemd: Starting Create List of Static Device Nodes...
Oct  8 12:45:41 np0005477492 systemd: Starting Load Kernel Module configfs...
Oct  8 12:45:41 np0005477492 systemd: Starting Load Kernel Module drm...
Oct  8 12:45:41 np0005477492 systemd: Starting Load Kernel Module efi_pstore...
Oct  8 12:45:41 np0005477492 systemd: Starting Load Kernel Module fuse...
Oct  8 12:45:41 np0005477492 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Oct  8 12:45:41 np0005477492 systemd: systemd-fsck-root.service: Deactivated successfully.
Oct  8 12:45:41 np0005477492 systemd: Stopped File System Check on Root Device.
Oct  8 12:45:41 np0005477492 systemd: Stopped Journal Service.
Oct  8 12:45:41 np0005477492 kernel: fuse: init (API version 7.37)
Oct  8 12:45:41 np0005477492 systemd: Starting Journal Service...
Oct  8 12:45:41 np0005477492 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Oct  8 12:45:41 np0005477492 systemd: Starting Generate network units from Kernel command line...
Oct  8 12:45:41 np0005477492 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct  8 12:45:41 np0005477492 systemd: Starting Remount Root and Kernel File Systems...
Oct  8 12:45:41 np0005477492 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Oct  8 12:45:41 np0005477492 systemd: Starting Apply Kernel Variables...
Oct  8 12:45:41 np0005477492 systemd: Starting Coldplug All udev Devices...
Oct  8 12:45:41 np0005477492 systemd-journald[677]: Journal started
Oct  8 12:45:41 np0005477492 systemd-journald[677]: Runtime Journal (/run/log/journal/42833e1b511a402df82cb9cb2fc36491) is 8.0M, max 153.5M, 145.5M free.
Oct  8 12:45:41 np0005477492 systemd[1]: Queued start job for default target Multi-User System.
Oct  8 12:45:41 np0005477492 systemd[1]: systemd-journald.service: Deactivated successfully.
Oct  8 12:45:41 np0005477492 systemd: Started Journal Service.
Oct  8 12:45:41 np0005477492 systemd[1]: Mounted Huge Pages File System.
Oct  8 12:45:41 np0005477492 systemd[1]: Mounted POSIX Message Queue File System.
Oct  8 12:45:41 np0005477492 systemd[1]: Mounted Kernel Debug File System.
Oct  8 12:45:41 np0005477492 systemd[1]: Mounted Kernel Trace File System.
Oct  8 12:45:41 np0005477492 systemd[1]: Finished Create List of Static Device Nodes.
Oct  8 12:45:41 np0005477492 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct  8 12:45:41 np0005477492 systemd[1]: Finished Load Kernel Module configfs.
Oct  8 12:45:41 np0005477492 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Oct  8 12:45:41 np0005477492 systemd[1]: Finished Load Kernel Module efi_pstore.
Oct  8 12:45:41 np0005477492 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Oct  8 12:45:41 np0005477492 systemd[1]: Finished Load Kernel Module fuse.
Oct  8 12:45:41 np0005477492 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Oct  8 12:45:41 np0005477492 systemd[1]: Finished Generate network units from Kernel command line.
Oct  8 12:45:41 np0005477492 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Oct  8 12:45:41 np0005477492 systemd[1]: Finished Remount Root and Kernel File Systems.
Oct  8 12:45:41 np0005477492 systemd[1]: Finished Apply Kernel Variables.
Oct  8 12:45:41 np0005477492 kernel: ACPI: bus type drm_connector registered
Oct  8 12:45:41 np0005477492 systemd[1]: Mounting FUSE Control File System...
Oct  8 12:45:41 np0005477492 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Oct  8 12:45:41 np0005477492 systemd[1]: Starting Rebuild Hardware Database...
Oct  8 12:45:41 np0005477492 systemd[1]: Starting Flush Journal to Persistent Storage...
Oct  8 12:45:41 np0005477492 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Oct  8 12:45:41 np0005477492 systemd[1]: Starting Load/Save OS Random Seed...
Oct  8 12:45:41 np0005477492 systemd[1]: Starting Create System Users...
Oct  8 12:45:41 np0005477492 systemd[1]: modprobe@drm.service: Deactivated successfully.
Oct  8 12:45:41 np0005477492 systemd[1]: Finished Load Kernel Module drm.
Oct  8 12:45:41 np0005477492 systemd[1]: Mounted FUSE Control File System.
Oct  8 12:45:41 np0005477492 systemd-journald[677]: Runtime Journal (/run/log/journal/42833e1b511a402df82cb9cb2fc36491) is 8.0M, max 153.5M, 145.5M free.
Oct  8 12:45:41 np0005477492 systemd-journald[677]: Received client request to flush runtime journal.
Oct  8 12:45:41 np0005477492 systemd[1]: Finished Flush Journal to Persistent Storage.
Oct  8 12:45:41 np0005477492 systemd[1]: Finished Load/Save OS Random Seed.
Oct  8 12:45:41 np0005477492 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Oct  8 12:45:41 np0005477492 systemd[1]: Finished Create System Users.
Oct  8 12:45:41 np0005477492 systemd[1]: Starting Create Static Device Nodes in /dev...
Oct  8 12:45:41 np0005477492 systemd[1]: Finished Coldplug All udev Devices.
Oct  8 12:45:41 np0005477492 systemd[1]: Finished Create Static Device Nodes in /dev.
Oct  8 12:45:41 np0005477492 systemd[1]: Reached target Preparation for Local File Systems.
Oct  8 12:45:41 np0005477492 systemd[1]: Reached target Local File Systems.
Oct  8 12:45:41 np0005477492 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Oct  8 12:45:41 np0005477492 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Oct  8 12:45:41 np0005477492 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Oct  8 12:45:41 np0005477492 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Oct  8 12:45:41 np0005477492 systemd[1]: Starting Automatic Boot Loader Update...
Oct  8 12:45:41 np0005477492 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Oct  8 12:45:41 np0005477492 systemd[1]: Starting Create Volatile Files and Directories...
Oct  8 12:45:41 np0005477492 bootctl[697]: Couldn't find EFI system partition, skipping.
Oct  8 12:45:41 np0005477492 systemd[1]: Finished Automatic Boot Loader Update.
Oct  8 12:45:41 np0005477492 systemd[1]: Finished Create Volatile Files and Directories.
Oct  8 12:45:41 np0005477492 systemd[1]: Starting Security Auditing Service...
Oct  8 12:45:41 np0005477492 systemd[1]: Starting RPC Bind...
Oct  8 12:45:41 np0005477492 systemd[1]: Starting Rebuild Journal Catalog...
Oct  8 12:45:41 np0005477492 auditd[703]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Oct  8 12:45:41 np0005477492 auditd[703]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Oct  8 12:45:41 np0005477492 systemd[1]: Started RPC Bind.
Oct  8 12:45:41 np0005477492 systemd[1]: Finished Rebuild Journal Catalog.
Oct  8 12:45:41 np0005477492 augenrules[708]: /sbin/augenrules: No change
Oct  8 12:45:41 np0005477492 augenrules[723]: No rules
Oct  8 12:45:41 np0005477492 augenrules[723]: enabled 1
Oct  8 12:45:41 np0005477492 augenrules[723]: failure 1
Oct  8 12:45:41 np0005477492 augenrules[723]: pid 703
Oct  8 12:45:41 np0005477492 augenrules[723]: rate_limit 0
Oct  8 12:45:41 np0005477492 augenrules[723]: backlog_limit 8192
Oct  8 12:45:41 np0005477492 augenrules[723]: lost 0
Oct  8 12:45:41 np0005477492 augenrules[723]: backlog 3
Oct  8 12:45:41 np0005477492 augenrules[723]: backlog_wait_time 60000
Oct  8 12:45:41 np0005477492 augenrules[723]: backlog_wait_time_actual 0
Oct  8 12:45:41 np0005477492 augenrules[723]: enabled 1
Oct  8 12:45:41 np0005477492 augenrules[723]: failure 1
Oct  8 12:45:41 np0005477492 augenrules[723]: pid 703
Oct  8 12:45:41 np0005477492 augenrules[723]: rate_limit 0
Oct  8 12:45:41 np0005477492 augenrules[723]: backlog_limit 8192
Oct  8 12:45:41 np0005477492 augenrules[723]: lost 0
Oct  8 12:45:41 np0005477492 augenrules[723]: backlog 4
Oct  8 12:45:41 np0005477492 augenrules[723]: backlog_wait_time 60000
Oct  8 12:45:41 np0005477492 augenrules[723]: backlog_wait_time_actual 0
Oct  8 12:45:41 np0005477492 augenrules[723]: enabled 1
Oct  8 12:45:41 np0005477492 augenrules[723]: failure 1
Oct  8 12:45:41 np0005477492 augenrules[723]: pid 703
Oct  8 12:45:41 np0005477492 augenrules[723]: rate_limit 0
Oct  8 12:45:41 np0005477492 augenrules[723]: backlog_limit 8192
Oct  8 12:45:41 np0005477492 augenrules[723]: lost 0
Oct  8 12:45:41 np0005477492 augenrules[723]: backlog 8
Oct  8 12:45:41 np0005477492 augenrules[723]: backlog_wait_time 60000
Oct  8 12:45:41 np0005477492 augenrules[723]: backlog_wait_time_actual 0
Oct  8 12:45:41 np0005477492 systemd[1]: Started Security Auditing Service.
Oct  8 12:45:41 np0005477492 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Oct  8 12:45:41 np0005477492 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Oct  8 12:45:41 np0005477492 systemd[1]: Finished Rebuild Hardware Database.
Oct  8 12:45:41 np0005477492 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Oct  8 12:45:41 np0005477492 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Oct  8 12:45:41 np0005477492 systemd[1]: Starting Update is Completed...
Oct  8 12:45:42 np0005477492 systemd-udevd[731]: Using default interface naming scheme 'rhel-9.0'.
Oct  8 12:45:42 np0005477492 systemd[1]: Finished Update is Completed.
Oct  8 12:45:42 np0005477492 systemd[1]: Started Rule-based Manager for Device Events and Files.
Oct  8 12:45:42 np0005477492 systemd[1]: Reached target System Initialization.
Oct  8 12:45:42 np0005477492 systemd[1]: Started dnf makecache --timer.
Oct  8 12:45:42 np0005477492 systemd[1]: Started Daily rotation of log files.
Oct  8 12:45:42 np0005477492 systemd[1]: Started Daily Cleanup of Temporary Directories.
Oct  8 12:45:42 np0005477492 systemd[1]: Reached target Timer Units.
Oct  8 12:45:42 np0005477492 systemd[1]: Listening on D-Bus System Message Bus Socket.
Oct  8 12:45:42 np0005477492 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Oct  8 12:45:42 np0005477492 systemd[1]: Reached target Socket Units.
Oct  8 12:45:42 np0005477492 systemd[1]: Starting D-Bus System Message Bus...
Oct  8 12:45:42 np0005477492 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct  8 12:45:42 np0005477492 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Oct  8 12:45:42 np0005477492 systemd[1]: Starting Load Kernel Module configfs...
Oct  8 12:45:42 np0005477492 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct  8 12:45:42 np0005477492 systemd[1]: Finished Load Kernel Module configfs.
Oct  8 12:45:42 np0005477492 systemd-udevd[736]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 12:45:42 np0005477492 systemd[1]: Started D-Bus System Message Bus.
Oct  8 12:45:42 np0005477492 systemd[1]: Reached target Basic System.
Oct  8 12:45:42 np0005477492 dbus-broker-lau[743]: Ready
Oct  8 12:45:42 np0005477492 systemd[1]: Starting NTP client/server...
Oct  8 12:45:42 np0005477492 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Oct  8 12:45:42 np0005477492 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Oct  8 12:45:42 np0005477492 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Oct  8 12:45:42 np0005477492 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Oct  8 12:45:42 np0005477492 systemd[1]: Starting Restore /run/initramfs on shutdown...
Oct  8 12:45:42 np0005477492 systemd[1]: Starting IPv4 firewall with iptables...
Oct  8 12:45:42 np0005477492 systemd[1]: Started irqbalance daemon.
Oct  8 12:45:42 np0005477492 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Oct  8 12:45:42 np0005477492 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  8 12:45:42 np0005477492 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  8 12:45:42 np0005477492 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  8 12:45:42 np0005477492 systemd[1]: Reached target sshd-keygen.target.
Oct  8 12:45:42 np0005477492 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Oct  8 12:45:42 np0005477492 systemd[1]: Reached target User and Group Name Lookups.
Oct  8 12:45:42 np0005477492 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Oct  8 12:45:42 np0005477492 systemd[1]: Starting User Login Management...
Oct  8 12:45:42 np0005477492 systemd[1]: Finished Restore /run/initramfs on shutdown.
Oct  8 12:45:42 np0005477492 chronyd[794]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Oct  8 12:45:42 np0005477492 chronyd[794]: Loaded 0 symmetric keys
Oct  8 12:45:42 np0005477492 chronyd[794]: Using right/UTC timezone to obtain leap second data
Oct  8 12:45:42 np0005477492 chronyd[794]: Loaded seccomp filter (level 2)
Oct  8 12:45:42 np0005477492 systemd[1]: Started NTP client/server.
Oct  8 12:45:42 np0005477492 systemd-logind[786]: Watching system buttons on /dev/input/event0 (Power Button)
Oct  8 12:45:42 np0005477492 systemd-logind[786]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Oct  8 12:45:42 np0005477492 systemd-logind[786]: New seat seat0.
Oct  8 12:45:42 np0005477492 systemd[1]: Started User Login Management.
Oct  8 12:45:42 np0005477492 kernel: kvm_amd: TSC scaling supported
Oct  8 12:45:42 np0005477492 kernel: kvm_amd: Nested Virtualization enabled
Oct  8 12:45:42 np0005477492 kernel: kvm_amd: Nested Paging enabled
Oct  8 12:45:42 np0005477492 kernel: kvm_amd: LBR virtualization supported
Oct  8 12:45:42 np0005477492 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Oct  8 12:45:42 np0005477492 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Oct  8 12:45:42 np0005477492 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Oct  8 12:45:42 np0005477492 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Oct  8 12:45:42 np0005477492 kernel: Console: switching to colour dummy device 80x25
Oct  8 12:45:42 np0005477492 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Oct  8 12:45:42 np0005477492 kernel: [drm] features: -context_init
Oct  8 12:45:42 np0005477492 kernel: [drm] number of scanouts: 1
Oct  8 12:45:42 np0005477492 kernel: [drm] number of cap sets: 0
Oct  8 12:45:42 np0005477492 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Oct  8 12:45:42 np0005477492 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Oct  8 12:45:42 np0005477492 kernel: Console: switching to colour frame buffer device 128x48
Oct  8 12:45:42 np0005477492 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Oct  8 12:45:42 np0005477492 iptables.init[779]: iptables: Applying firewall rules: [  OK  ]
Oct  8 12:45:42 np0005477492 systemd[1]: Finished IPv4 firewall with iptables.
Oct  8 12:45:42 np0005477492 cloud-init[839]: Cloud-init v. 24.4-7.el9 running 'init-local' at Wed, 08 Oct 2025 16:45:42 +0000. Up 6.49 seconds.
Oct  8 12:45:43 np0005477492 systemd[1]: run-cloud\x2dinit-tmp-tmp9lhq_lza.mount: Deactivated successfully.
Oct  8 12:45:43 np0005477492 systemd[1]: Starting Hostname Service...
Oct  8 12:45:43 np0005477492 systemd[1]: Started Hostname Service.
Oct  8 12:45:43 np0005477492 systemd-hostnamed[853]: Hostname set to <np0005477492.novalocal> (static)
Oct  8 12:45:43 np0005477492 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Oct  8 12:45:43 np0005477492 systemd[1]: Reached target Preparation for Network.
Oct  8 12:45:43 np0005477492 systemd[1]: Starting Network Manager...
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.4187] NetworkManager (version 1.54.1-1.el9) is starting... (boot:cb0c00bd-184a-4765-8005-06a5fc6550cb)
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.4192] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.4372] manager[0x55e24be06080]: monitoring kernel firmware directory '/lib/firmware'.
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.4436] hostname: hostname: using hostnamed
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.4437] hostname: static hostname changed from (none) to "np0005477492.novalocal"
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.4440] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.4553] manager[0x55e24be06080]: rfkill: Wi-Fi hardware radio set enabled
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.4553] manager[0x55e24be06080]: rfkill: WWAN hardware radio set enabled
Oct  8 12:45:43 np0005477492 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.4642] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.4642] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.4643] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.4644] manager: Networking is enabled by state file
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.4647] settings: Loaded settings plugin: keyfile (internal)
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.4674] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.4706] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.4730] dhcp: init: Using DHCP client 'internal'
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.4734] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.4750] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.4763] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.4773] device (lo): Activation: starting connection 'lo' (da957721-2ac9-44f5-bcd8-228e504809c9)
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.4784] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.4788] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 12:45:43 np0005477492 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.4849] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.4853] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.4857] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct  8 12:45:43 np0005477492 systemd[1]: Started Network Manager.
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.4860] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.4863] device (eth0): carrier: link connected
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.4866] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct  8 12:45:43 np0005477492 systemd[1]: Reached target Network.
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.4875] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.4893] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.4899] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.4901] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 12:45:43 np0005477492 systemd[1]: Starting Network Manager Wait Online...
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.4904] manager: NetworkManager state is now CONNECTING
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.4906] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.4920] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.4924] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  8 12:45:43 np0005477492 systemd[1]: Starting GSSAPI Proxy Daemon...
Oct  8 12:45:43 np0005477492 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.5089] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.5103] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.5113] device (lo): Activation: successful, device activated.
Oct  8 12:45:43 np0005477492 systemd[1]: Started GSSAPI Proxy Daemon.
Oct  8 12:45:43 np0005477492 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Oct  8 12:45:43 np0005477492 systemd[1]: Reached target NFS client services.
Oct  8 12:45:43 np0005477492 systemd[1]: Reached target Preparation for Remote File Systems.
Oct  8 12:45:43 np0005477492 systemd[1]: Reached target Remote File Systems.
Oct  8 12:45:43 np0005477492 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.7831] dhcp4 (eth0): state changed new lease, address=38.102.83.120
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.7851] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.7888] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.7926] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.7928] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.7930] manager: NetworkManager state is now CONNECTED_SITE
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.7933] device (eth0): Activation: successful, device activated.
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.7938] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct  8 12:45:43 np0005477492 NetworkManager[857]: <info>  [1759941943.7939] manager: startup complete
Oct  8 12:45:43 np0005477492 systemd[1]: Finished Network Manager Wait Online.
Oct  8 12:45:43 np0005477492 systemd[1]: Starting Cloud-init: Network Stage...
Oct  8 12:45:44 np0005477492 cloud-init[922]: Cloud-init v. 24.4-7.el9 running 'init' at Wed, 08 Oct 2025 16:45:44 +0000. Up 7.80 seconds.
Oct  8 12:45:44 np0005477492 cloud-init[922]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Oct  8 12:45:44 np0005477492 cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct  8 12:45:44 np0005477492 cloud-init[922]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Oct  8 12:45:44 np0005477492 cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct  8 12:45:44 np0005477492 cloud-init[922]: ci-info: |  eth0  | True |        38.102.83.120         | 255.255.255.0 | global | fa:16:3e:22:ef:71 |
Oct  8 12:45:44 np0005477492 cloud-init[922]: ci-info: |  eth0  | True | fe80::f816:3eff:fe22:ef71/64 |       .       |  link  | fa:16:3e:22:ef:71 |
Oct  8 12:45:44 np0005477492 cloud-init[922]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Oct  8 12:45:44 np0005477492 cloud-init[922]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Oct  8 12:45:44 np0005477492 cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct  8 12:45:44 np0005477492 cloud-init[922]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Oct  8 12:45:44 np0005477492 cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct  8 12:45:44 np0005477492 cloud-init[922]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Oct  8 12:45:44 np0005477492 cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct  8 12:45:44 np0005477492 cloud-init[922]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Oct  8 12:45:44 np0005477492 cloud-init[922]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Oct  8 12:45:44 np0005477492 cloud-init[922]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Oct  8 12:45:44 np0005477492 cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct  8 12:45:44 np0005477492 cloud-init[922]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Oct  8 12:45:44 np0005477492 cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Oct  8 12:45:44 np0005477492 cloud-init[922]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Oct  8 12:45:44 np0005477492 cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Oct  8 12:45:44 np0005477492 cloud-init[922]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Oct  8 12:45:44 np0005477492 cloud-init[922]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Oct  8 12:45:44 np0005477492 cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Oct  8 12:45:45 np0005477492 cloud-init[922]: Generating public/private rsa key pair.
Oct  8 12:45:45 np0005477492 cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Oct  8 12:45:45 np0005477492 cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Oct  8 12:45:45 np0005477492 cloud-init[922]: The key fingerprint is:
Oct  8 12:45:45 np0005477492 cloud-init[922]: SHA256:h47UbvGW5EDNmvtOkNKiujNLBg8kEhxyDePrJih2+F8 root@np0005477492.novalocal
Oct  8 12:45:45 np0005477492 cloud-init[922]: The key's randomart image is:
Oct  8 12:45:45 np0005477492 cloud-init[922]: +---[RSA 3072]----+
Oct  8 12:45:45 np0005477492 cloud-init[922]: |+.=o             |
Oct  8 12:45:45 np0005477492 cloud-init[922]: |.= ..    o       |
Oct  8 12:45:45 np0005477492 cloud-init[922]: |o..     . o      |
Oct  8 12:45:45 np0005477492 cloud-init[922]: |+  .   + =       |
Oct  8 12:45:45 np0005477492 cloud-init[922]: |o .   + S o      |
Oct  8 12:45:45 np0005477492 cloud-init[922]: |.=.  o * X .     |
Oct  8 12:45:45 np0005477492 cloud-init[922]: |+o*.. .E= *      |
Oct  8 12:45:45 np0005477492 cloud-init[922]: |o==.  .. +       |
Oct  8 12:45:45 np0005477492 cloud-init[922]: |  +*..   .o      |
Oct  8 12:45:45 np0005477492 cloud-init[922]: +----[SHA256]-----+
Oct  8 12:45:45 np0005477492 cloud-init[922]: Generating public/private ecdsa key pair.
Oct  8 12:45:45 np0005477492 cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Oct  8 12:45:45 np0005477492 cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Oct  8 12:45:45 np0005477492 cloud-init[922]: The key fingerprint is:
Oct  8 12:45:45 np0005477492 cloud-init[922]: SHA256:z0Uiu4oYA5YQSXpzRXDIa+WOxcxr8jGwGLkLyl9KG3k root@np0005477492.novalocal
Oct  8 12:45:45 np0005477492 cloud-init[922]: The key's randomart image is:
Oct  8 12:45:45 np0005477492 cloud-init[922]: +---[ECDSA 256]---+
Oct  8 12:45:45 np0005477492 cloud-init[922]: |oo ..++          |
Oct  8 12:45:45 np0005477492 cloud-init[922]: |o.  oo.          |
Oct  8 12:45:45 np0005477492 cloud-init[922]: |o o..B  . . .    |
Oct  8 12:45:45 np0005477492 cloud-init[922]: |..+o+ *  o o     |
Oct  8 12:45:45 np0005477492 cloud-init[922]: |.o = * .S   .    |
Oct  8 12:45:45 np0005477492 cloud-init[922]: |o.o = *  + .     |
Oct  8 12:45:45 np0005477492 cloud-init[922]: |o.o= E o. o      |
Oct  8 12:45:45 np0005477492 cloud-init[922]: |..o+*...         |
Oct  8 12:45:45 np0005477492 cloud-init[922]: |  o+. .          |
Oct  8 12:45:45 np0005477492 cloud-init[922]: +----[SHA256]-----+
Oct  8 12:45:45 np0005477492 cloud-init[922]: Generating public/private ed25519 key pair.
Oct  8 12:45:45 np0005477492 cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Oct  8 12:45:45 np0005477492 cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Oct  8 12:45:45 np0005477492 cloud-init[922]: The key fingerprint is:
Oct  8 12:45:45 np0005477492 cloud-init[922]: SHA256:vBAhE/RcdUQQVZlhUH0uQdkfaL+KL59lBB5ShHQJP5o root@np0005477492.novalocal
Oct  8 12:45:45 np0005477492 cloud-init[922]: The key's randomart image is:
Oct  8 12:45:45 np0005477492 cloud-init[922]: +--[ED25519 256]--+
Oct  8 12:45:45 np0005477492 cloud-init[922]: |   .=.. ..=OXOOB |
Oct  8 12:45:45 np0005477492 cloud-init[922]: |     = o   o+==.+|
Oct  8 12:45:45 np0005477492 cloud-init[922]: |      +    ..=.o+|
Oct  8 12:45:45 np0005477492 cloud-init[922]: |       o    = =.o|
Oct  8 12:45:45 np0005477492 cloud-init[922]: |      . S  E . o.|
Oct  8 12:45:45 np0005477492 cloud-init[922]: |       . .    .. |
Oct  8 12:45:45 np0005477492 cloud-init[922]: |        .   . .o |
Oct  8 12:45:45 np0005477492 cloud-init[922]: |           o .+  |
Oct  8 12:45:45 np0005477492 cloud-init[922]: |            ++   |
Oct  8 12:45:45 np0005477492 cloud-init[922]: +----[SHA256]-----+
Oct  8 12:45:45 np0005477492 systemd[1]: Finished Cloud-init: Network Stage.
Oct  8 12:45:45 np0005477492 systemd[1]: Reached target Cloud-config availability.
Oct  8 12:45:45 np0005477492 systemd[1]: Reached target Network is Online.
Oct  8 12:45:45 np0005477492 systemd[1]: Starting Cloud-init: Config Stage...
Oct  8 12:45:45 np0005477492 systemd[1]: Starting Notify NFS peers of a restart...
Oct  8 12:45:45 np0005477492 systemd[1]: Starting System Logging Service...
Oct  8 12:45:45 np0005477492 sm-notify[1003]: Version 2.5.4 starting
Oct  8 12:45:45 np0005477492 systemd[1]: Starting OpenSSH server daemon...
Oct  8 12:45:45 np0005477492 systemd[1]: Starting Permit User Sessions...
Oct  8 12:45:45 np0005477492 systemd[1]: Started Notify NFS peers of a restart.
Oct  8 12:45:45 np0005477492 systemd[1]: Finished Permit User Sessions.
Oct  8 12:45:45 np0005477492 rsyslogd[1004]: [origin software="rsyslogd" swVersion="8.2506.0-2.el9" x-pid="1004" x-info="https://www.rsyslog.com"] start
Oct  8 12:45:45 np0005477492 rsyslogd[1004]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Oct  8 12:45:45 np0005477492 systemd[1]: Started Command Scheduler.
Oct  8 12:45:45 np0005477492 systemd[1]: Started Getty on tty1.
Oct  8 12:45:45 np0005477492 systemd[1]: Started Serial Getty on ttyS0.
Oct  8 12:45:45 np0005477492 systemd[1]: Reached target Login Prompts.
Oct  8 12:45:45 np0005477492 systemd[1]: Started OpenSSH server daemon.
Oct  8 12:45:45 np0005477492 systemd[1]: Started System Logging Service.
Oct  8 12:45:45 np0005477492 systemd[1]: Reached target Multi-User System.
Oct  8 12:45:45 np0005477492 systemd[1]: Starting Record Runlevel Change in UTMP...
Oct  8 12:45:45 np0005477492 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Oct  8 12:45:45 np0005477492 systemd[1]: Finished Record Runlevel Change in UTMP.
Oct  8 12:45:45 np0005477492 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  8 12:45:46 np0005477492 cloud-init[1017]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Wed, 08 Oct 2025 16:45:45 +0000. Up 9.65 seconds.
Oct  8 12:45:46 np0005477492 systemd[1]: Finished Cloud-init: Config Stage.
Oct  8 12:45:46 np0005477492 systemd[1]: Starting Cloud-init: Final Stage...
Oct  8 12:45:46 np0005477492 cloud-init[1021]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Wed, 08 Oct 2025 16:45:46 +0000. Up 10.02 seconds.
Oct  8 12:45:46 np0005477492 cloud-init[1023]: #############################################################
Oct  8 12:45:46 np0005477492 cloud-init[1024]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Oct  8 12:45:46 np0005477492 cloud-init[1026]: 256 SHA256:z0Uiu4oYA5YQSXpzRXDIa+WOxcxr8jGwGLkLyl9KG3k root@np0005477492.novalocal (ECDSA)
Oct  8 12:45:46 np0005477492 cloud-init[1028]: 256 SHA256:vBAhE/RcdUQQVZlhUH0uQdkfaL+KL59lBB5ShHQJP5o root@np0005477492.novalocal (ED25519)
Oct  8 12:45:46 np0005477492 cloud-init[1030]: 3072 SHA256:h47UbvGW5EDNmvtOkNKiujNLBg8kEhxyDePrJih2+F8 root@np0005477492.novalocal (RSA)
Oct  8 12:45:46 np0005477492 cloud-init[1031]: -----END SSH HOST KEY FINGERPRINTS-----
Oct  8 12:45:46 np0005477492 cloud-init[1032]: #############################################################
Oct  8 12:45:46 np0005477492 cloud-init[1021]: Cloud-init v. 24.4-7.el9 finished at Wed, 08 Oct 2025 16:45:46 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 10.23 seconds
Oct  8 12:45:46 np0005477492 systemd[1]: Finished Cloud-init: Final Stage.
Oct  8 12:45:46 np0005477492 systemd[1]: Reached target Cloud-init target.
Oct  8 12:45:46 np0005477492 systemd[1]: Startup finished in 1.634s (kernel) + 2.591s (initrd) + 6.085s (userspace) = 10.311s.
Oct  8 12:45:48 np0005477492 chronyd[794]: Selected source 162.159.200.123 (2.centos.pool.ntp.org)
Oct  8 12:45:50 np0005477492 chronyd[794]: System clock wrong by 1.677660 seconds
Oct  8 12:45:50 np0005477492 chronyd[794]: System clock was stepped by 1.677660 seconds
Oct  8 12:45:50 np0005477492 chronyd[794]: System clock TAI offset set to 37 seconds
Oct  8 12:45:51 np0005477492 chronyd[794]: Selected source 174.142.148.226 (2.centos.pool.ntp.org)
Oct  8 12:45:53 np0005477492 irqbalance[781]: Cannot change IRQ 25 affinity: Operation not permitted
Oct  8 12:45:54 np0005477492 irqbalance[781]: IRQ 25 affinity is now unmanaged
Oct  8 12:45:54 np0005477492 irqbalance[781]: Cannot change IRQ 31 affinity: Operation not permitted
Oct  8 12:45:54 np0005477492 irqbalance[781]: IRQ 31 affinity is now unmanaged
Oct  8 12:45:54 np0005477492 irqbalance[781]: Cannot change IRQ 28 affinity: Operation not permitted
Oct  8 12:45:54 np0005477492 irqbalance[781]: IRQ 28 affinity is now unmanaged
Oct  8 12:45:54 np0005477492 irqbalance[781]: Cannot change IRQ 32 affinity: Operation not permitted
Oct  8 12:45:54 np0005477492 irqbalance[781]: IRQ 32 affinity is now unmanaged
Oct  8 12:45:54 np0005477492 irqbalance[781]: Cannot change IRQ 30 affinity: Operation not permitted
Oct  8 12:45:54 np0005477492 irqbalance[781]: IRQ 30 affinity is now unmanaged
Oct  8 12:45:54 np0005477492 irqbalance[781]: Cannot change IRQ 29 affinity: Operation not permitted
Oct  8 12:45:54 np0005477492 irqbalance[781]: IRQ 29 affinity is now unmanaged
Oct  8 12:45:55 np0005477492 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  8 12:46:15 np0005477492 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct  8 13:01:01 np0005477492 systemd[1]: Starting Cleanup of Temporary Directories...
Oct  8 13:01:01 np0005477492 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Oct  8 13:01:01 np0005477492 systemd[1]: Finished Cleanup of Temporary Directories.
Oct  8 13:01:01 np0005477492 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Oct  8 13:16:03 np0005477492 systemd[1]: Starting dnf makecache...
Oct  8 13:16:04 np0005477492 dnf[1102]: Failed determining last makecache time.
Oct  8 13:16:04 np0005477492 dnf[1102]: CentOS Stream 9 - BaseOS                         32 kB/s | 6.7 kB     00:00
Oct  8 13:16:04 np0005477492 dnf[1102]: CentOS Stream 9 - AppStream                      58 kB/s | 6.8 kB     00:00
Oct  8 13:16:05 np0005477492 dnf[1102]: CentOS Stream 9 - CRB                            61 kB/s | 6.6 kB     00:00
Oct  8 13:16:05 np0005477492 dnf[1102]: CentOS Stream 9 - Extras packages                75 kB/s | 8.0 kB     00:00
Oct  8 13:16:05 np0005477492 dnf[1102]: Metadata cache created.
Oct  8 13:16:05 np0005477492 systemd[1]: dnf-makecache.service: Deactivated successfully.
Oct  8 13:16:05 np0005477492 systemd[1]: Finished dnf makecache.
Oct  8 14:13:34 np0005477492 systemd[1]: Created slice User Slice of UID 1000.
Oct  8 14:13:34 np0005477492 systemd[1]: Starting User Runtime Directory /run/user/1000...
Oct  8 14:13:34 np0005477492 systemd-logind[786]: New session 1 of user zuul.
Oct  8 14:13:34 np0005477492 systemd[1]: Finished User Runtime Directory /run/user/1000.
Oct  8 14:13:34 np0005477492 systemd[1]: Starting User Manager for UID 1000...
Oct  8 14:13:34 np0005477492 systemd[1173]: Queued start job for default target Main User Target.
Oct  8 14:13:34 np0005477492 systemd[1173]: Created slice User Application Slice.
Oct  8 14:13:34 np0005477492 systemd[1173]: Started Mark boot as successful after the user session has run 2 minutes.
Oct  8 14:13:34 np0005477492 systemd[1173]: Started Daily Cleanup of User's Temporary Directories.
Oct  8 14:13:34 np0005477492 systemd[1173]: Reached target Paths.
Oct  8 14:13:34 np0005477492 systemd[1173]: Reached target Timers.
Oct  8 14:13:34 np0005477492 systemd[1173]: Starting D-Bus User Message Bus Socket...
Oct  8 14:13:34 np0005477492 systemd[1173]: Starting Create User's Volatile Files and Directories...
Oct  8 14:13:34 np0005477492 systemd[1173]: Listening on D-Bus User Message Bus Socket.
Oct  8 14:13:34 np0005477492 systemd[1173]: Reached target Sockets.
Oct  8 14:13:34 np0005477492 systemd[1173]: Finished Create User's Volatile Files and Directories.
Oct  8 14:13:34 np0005477492 systemd[1173]: Reached target Basic System.
Oct  8 14:13:34 np0005477492 systemd[1173]: Reached target Main User Target.
Oct  8 14:13:34 np0005477492 systemd[1173]: Startup finished in 143ms.
Oct  8 14:13:34 np0005477492 systemd[1]: Started User Manager for UID 1000.
Oct  8 14:13:34 np0005477492 systemd[1]: Started Session 1 of User zuul.
Oct  8 14:13:35 np0005477492 python3[1258]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 14:13:37 np0005477492 python3[1286]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 14:13:43 np0005477492 python3[1344]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 14:13:44 np0005477492 python3[1384]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Oct  8 14:13:46 np0005477492 python3[1410]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCobw03zNfvEDJBvlfoQSer7BrusIjbT7QoXKmhLheqGV1zGmAT+R/Q0brtOv2MtIOb2inxhE1PYOx1xcvUvOlbIvZ6NO+wfLYLQj8ZlafVc+vK/AZhFubKWGhfYmw1eraBror6R2pG3MmlDMFUUukzW8o5cRzR1Jj3gJ7Y1tL4tMjOw+v9oYxIR/1l9lmOaPFmJoSIMIPXWbq2n/YH3iX83PCsxiYSkk7vJ4BNmP7I8RhwfZQ5RLp4gFDvzYCoivcfwwt6MoShJ9zhI/XqrSNF3j3ibf/eyDiVaqvgRcGlUfdJNCsBm0bUAeDV9VcclpavfSm56JHPTPScV+L6dlGhHifXvkOwXOEv4LdFy4dQWwIC3FlNXMUac6JxyBaOkjJd6O4uDSU0kkB7Kh7QlucoRh053L4l/m7lneNFJWyUcubgNJQiboqIw3Tj1od0RKHqhKgp028oXiNr/ZdZp/Yv8e1ot167LMnF3znBp/P5zGiDvhIhta9yMmejxilY5+s= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 14:13:46 np0005477492 python3[1434]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:13:47 np0005477492 python3[1533]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  8 14:13:47 np0005477492 python3[1604]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759947226.9605422-207-258256760049725/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=72d072d8437d4abb88b650c3bafd2958_id_rsa follow=False checksum=6aab32b389297ad3213fee67f911c0b4bb30ac8f backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:13:48 np0005477492 python3[1727]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  8 14:13:48 np0005477492 python3[1798]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759947227.917303-240-114289845138245/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=72d072d8437d4abb88b650c3bafd2958_id_rsa.pub follow=False checksum=c13146420f10036cba44af4c2dd1b1763ef32c09 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:13:49 np0005477492 python3[1846]: ansible-ping Invoked with data=pong
Oct  8 14:13:50 np0005477492 python3[1870]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 14:13:52 np0005477492 python3[1928]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Oct  8 14:13:53 np0005477492 python3[1960]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:13:53 np0005477492 python3[1984]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:13:54 np0005477492 python3[2008]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:13:54 np0005477492 python3[2032]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:13:54 np0005477492 python3[2056]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:13:55 np0005477492 python3[2080]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:13:56 np0005477492 python3[2106]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:13:57 np0005477492 python3[2184]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  8 14:13:57 np0005477492 python3[2257]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759947236.74247-21-102152316905474/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:13:58 np0005477492 python3[2305]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 14:13:58 np0005477492 python3[2329]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 14:13:58 np0005477492 python3[2353]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 14:13:59 np0005477492 python3[2377]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 14:13:59 np0005477492 python3[2401]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 14:13:59 np0005477492 python3[2425]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 14:13:59 np0005477492 python3[2449]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 14:14:00 np0005477492 python3[2473]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 14:14:00 np0005477492 python3[2497]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 14:14:00 np0005477492 python3[2521]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 14:14:01 np0005477492 python3[2545]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 14:14:01 np0005477492 python3[2569]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 14:14:01 np0005477492 python3[2593]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 14:14:01 np0005477492 python3[2617]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 14:14:02 np0005477492 python3[2641]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 14:14:02 np0005477492 python3[2665]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 14:14:02 np0005477492 python3[2689]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 14:14:03 np0005477492 python3[2713]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 14:14:03 np0005477492 python3[2737]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 14:14:03 np0005477492 python3[2761]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 14:14:03 np0005477492 python3[2785]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 14:14:04 np0005477492 python3[2809]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 14:14:04 np0005477492 python3[2833]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 14:14:04 np0005477492 python3[2857]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 14:14:05 np0005477492 python3[2881]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 14:14:05 np0005477492 python3[2905]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 14:14:08 np0005477492 python3[2931]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Oct  8 14:14:08 np0005477492 systemd[1]: Starting Time & Date Service...
Oct  8 14:14:08 np0005477492 systemd[1]: Started Time & Date Service.
Oct  8 14:14:09 np0005477492 systemd-timedated[2933]: Changed time zone to 'UTC' (UTC).
Oct  8 14:14:09 np0005477492 python3[2962]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:14:10 np0005477492 python3[3038]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  8 14:14:10 np0005477492 python3[3109]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1759947249.757911-153-274905988808541/source _original_basename=tmp20z9e5ak follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:14:11 np0005477492 python3[3209]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  8 14:14:11 np0005477492 python3[3280]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1759947250.7097917-183-193243939281306/source _original_basename=tmp3rtqn0j8 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:14:12 np0005477492 python3[3382]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  8 14:14:12 np0005477492 python3[3455]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1759947251.8357704-231-236881757978110/source _original_basename=tmpdbfg8c1k follow=False checksum=b72a4c76bf5dc99bdd97f862087672d87e62b0ee backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:14:13 np0005477492 python3[3503]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 14:14:13 np0005477492 python3[3529]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 14:14:13 np0005477492 python3[3609]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  8 14:14:14 np0005477492 python3[3682]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1759947253.5442102-273-977812052100/source _original_basename=tmpef2piuok follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:14:14 np0005477492 python3[3733]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163efc-24cc-bc0b-5f06-00000000001d-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 14:14:15 np0005477492 python3[3761]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163efc-24cc-bc0b-5f06-00000000001e-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Oct  8 14:14:16 np0005477492 python3[3790]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:14:33 np0005477492 python3[3816]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:14:39 np0005477492 systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct  8 14:15:08 np0005477492 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Oct  8 14:15:08 np0005477492 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Oct  8 14:15:08 np0005477492 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Oct  8 14:15:08 np0005477492 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Oct  8 14:15:08 np0005477492 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Oct  8 14:15:08 np0005477492 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Oct  8 14:15:08 np0005477492 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Oct  8 14:15:08 np0005477492 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Oct  8 14:15:08 np0005477492 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Oct  8 14:15:08 np0005477492 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Oct  8 14:15:08 np0005477492 NetworkManager[857]: <info>  [1759947308.6659] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct  8 14:15:08 np0005477492 systemd-udevd[3820]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 14:15:08 np0005477492 NetworkManager[857]: <info>  [1759947308.6912] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 14:15:08 np0005477492 NetworkManager[857]: <info>  [1759947308.6946] settings: (eth1): created default wired connection 'Wired connection 1'
Oct  8 14:15:08 np0005477492 NetworkManager[857]: <info>  [1759947308.6951] device (eth1): carrier: link connected
Oct  8 14:15:08 np0005477492 NetworkManager[857]: <info>  [1759947308.6954] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Oct  8 14:15:08 np0005477492 NetworkManager[857]: <info>  [1759947308.6962] policy: auto-activating connection 'Wired connection 1' (8bdaa810-1639-3a4a-8290-5b06515ee0ed)
Oct  8 14:15:08 np0005477492 NetworkManager[857]: <info>  [1759947308.6967] device (eth1): Activation: starting connection 'Wired connection 1' (8bdaa810-1639-3a4a-8290-5b06515ee0ed)
Oct  8 14:15:08 np0005477492 NetworkManager[857]: <info>  [1759947308.6968] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 14:15:08 np0005477492 NetworkManager[857]: <info>  [1759947308.6972] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 14:15:08 np0005477492 NetworkManager[857]: <info>  [1759947308.6977] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 14:15:08 np0005477492 NetworkManager[857]: <info>  [1759947308.6982] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct  8 14:15:09 np0005477492 python3[3847]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163efc-24cc-1d4c-6e7b-0000000000fc-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 14:15:16 np0005477492 python3[3928]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  8 14:15:16 np0005477492 python3[4001]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759947316.2101626-102-267398303920290/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=27aa3ff62700b868632c26bba0f990edf6a232f7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:15:17 np0005477492 python3[4051]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 14:15:17 np0005477492 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Oct  8 14:15:17 np0005477492 systemd[1]: Stopped Network Manager Wait Online.
Oct  8 14:15:17 np0005477492 systemd[1]: Stopping Network Manager Wait Online...
Oct  8 14:15:17 np0005477492 NetworkManager[857]: <info>  [1759947317.7506] caught SIGTERM, shutting down normally.
Oct  8 14:15:17 np0005477492 systemd[1]: Stopping Network Manager...
Oct  8 14:15:17 np0005477492 NetworkManager[857]: <info>  [1759947317.7515] dhcp4 (eth0): canceled DHCP transaction
Oct  8 14:15:17 np0005477492 NetworkManager[857]: <info>  [1759947317.7515] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  8 14:15:17 np0005477492 NetworkManager[857]: <info>  [1759947317.7515] dhcp4 (eth0): state changed no lease
Oct  8 14:15:17 np0005477492 NetworkManager[857]: <info>  [1759947317.7517] manager: NetworkManager state is now CONNECTING
Oct  8 14:15:17 np0005477492 NetworkManager[857]: <info>  [1759947317.7604] dhcp4 (eth1): canceled DHCP transaction
Oct  8 14:15:17 np0005477492 NetworkManager[857]: <info>  [1759947317.7605] dhcp4 (eth1): state changed no lease
Oct  8 14:15:17 np0005477492 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  8 14:15:17 np0005477492 NetworkManager[857]: <info>  [1759947317.7664] exiting (success)
Oct  8 14:15:17 np0005477492 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  8 14:15:17 np0005477492 systemd[1]: NetworkManager.service: Deactivated successfully.
Oct  8 14:15:17 np0005477492 systemd[1]: Stopped Network Manager.
Oct  8 14:15:17 np0005477492 systemd[1]: NetworkManager.service: Consumed 27.815s CPU time, 9.9M memory peak.
Oct  8 14:15:17 np0005477492 systemd[1]: Starting Network Manager...
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.8196] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:cb0c00bd-184a-4765-8005-06a5fc6550cb)
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.8200] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.8254] manager[0x55734ea42070]: monitoring kernel firmware directory '/lib/firmware'.
Oct  8 14:15:17 np0005477492 systemd[1]: Starting Hostname Service...
Oct  8 14:15:17 np0005477492 systemd[1]: Started Hostname Service.
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9009] hostname: hostname: using hostnamed
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9009] hostname: static hostname changed from (none) to "np0005477492.novalocal"
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9015] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9021] manager[0x55734ea42070]: rfkill: Wi-Fi hardware radio set enabled
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9021] manager[0x55734ea42070]: rfkill: WWAN hardware radio set enabled
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9048] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9049] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9049] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9049] manager: Networking is enabled by state file
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9051] settings: Loaded settings plugin: keyfile (internal)
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9056] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9079] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9088] dhcp: init: Using DHCP client 'internal'
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9091] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9095] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9099] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9105] device (lo): Activation: starting connection 'lo' (da957721-2ac9-44f5-bcd8-228e504809c9)
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9111] device (eth0): carrier: link connected
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9115] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9118] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9119] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9123] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9127] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9132] device (eth1): carrier: link connected
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9135] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9138] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (8bdaa810-1639-3a4a-8290-5b06515ee0ed) (indicated)
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9138] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9141] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9146] device (eth1): Activation: starting connection 'Wired connection 1' (8bdaa810-1639-3a4a-8290-5b06515ee0ed)
Oct  8 14:15:17 np0005477492 systemd[1]: Started Network Manager.
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9151] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9154] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9155] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9156] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9158] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9160] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9161] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9162] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9164] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9168] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9174] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9183] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9186] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9201] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9202] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct  8 14:15:17 np0005477492 NetworkManager[4064]: <info>  [1759947317.9207] device (lo): Activation: successful, device activated.
Oct  8 14:15:17 np0005477492 systemd[1]: Starting Network Manager Wait Online...
Oct  8 14:15:18 np0005477492 python3[4117]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163efc-24cc-1d4c-6e7b-0000000000a7-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 14:15:19 np0005477492 NetworkManager[4064]: <info>  [1759947319.8699] dhcp4 (eth0): state changed new lease, address=38.102.83.120
Oct  8 14:15:19 np0005477492 NetworkManager[4064]: <info>  [1759947319.8716] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct  8 14:15:19 np0005477492 NetworkManager[4064]: <info>  [1759947319.8789] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct  8 14:15:19 np0005477492 NetworkManager[4064]: <info>  [1759947319.8821] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct  8 14:15:19 np0005477492 NetworkManager[4064]: <info>  [1759947319.8823] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct  8 14:15:19 np0005477492 NetworkManager[4064]: <info>  [1759947319.8828] manager: NetworkManager state is now CONNECTED_SITE
Oct  8 14:15:19 np0005477492 NetworkManager[4064]: <info>  [1759947319.8831] device (eth0): Activation: successful, device activated.
Oct  8 14:15:19 np0005477492 NetworkManager[4064]: <info>  [1759947319.8837] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct  8 14:15:29 np0005477492 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  8 14:15:47 np0005477492 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct  8 14:16:02 np0005477492 NetworkManager[4064]: <info>  [1759947362.9886] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct  8 14:16:03 np0005477492 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  8 14:16:03 np0005477492 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  8 14:16:03 np0005477492 NetworkManager[4064]: <info>  [1759947363.0232] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct  8 14:16:03 np0005477492 NetworkManager[4064]: <info>  [1759947363.0237] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct  8 14:16:03 np0005477492 NetworkManager[4064]: <info>  [1759947363.0253] device (eth1): Activation: successful, device activated.
Oct  8 14:16:03 np0005477492 NetworkManager[4064]: <info>  [1759947363.0265] manager: startup complete
Oct  8 14:16:03 np0005477492 NetworkManager[4064]: <info>  [1759947363.0268] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Oct  8 14:16:03 np0005477492 NetworkManager[4064]: <warn>  [1759947363.0282] device (eth1): Activation: failed for connection 'Wired connection 1'
Oct  8 14:16:03 np0005477492 NetworkManager[4064]: <info>  [1759947363.0294] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Oct  8 14:16:03 np0005477492 systemd[1]: Finished Network Manager Wait Online.
Oct  8 14:16:03 np0005477492 NetworkManager[4064]: <info>  [1759947363.0455] dhcp4 (eth1): canceled DHCP transaction
Oct  8 14:16:03 np0005477492 NetworkManager[4064]: <info>  [1759947363.0456] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct  8 14:16:03 np0005477492 NetworkManager[4064]: <info>  [1759947363.0456] dhcp4 (eth1): state changed no lease
Oct  8 14:16:03 np0005477492 NetworkManager[4064]: <info>  [1759947363.0480] policy: auto-activating connection 'ci-private-network' (f659475b-7c6f-5319-b371-519bc515c6f0)
Oct  8 14:16:03 np0005477492 NetworkManager[4064]: <info>  [1759947363.0488] device (eth1): Activation: starting connection 'ci-private-network' (f659475b-7c6f-5319-b371-519bc515c6f0)
Oct  8 14:16:03 np0005477492 NetworkManager[4064]: <info>  [1759947363.0489] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 14:16:03 np0005477492 NetworkManager[4064]: <info>  [1759947363.0492] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 14:16:03 np0005477492 NetworkManager[4064]: <info>  [1759947363.0503] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 14:16:03 np0005477492 NetworkManager[4064]: <info>  [1759947363.0515] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 14:16:03 np0005477492 NetworkManager[4064]: <info>  [1759947363.0570] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 14:16:03 np0005477492 NetworkManager[4064]: <info>  [1759947363.0573] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 14:16:03 np0005477492 NetworkManager[4064]: <info>  [1759947363.0586] device (eth1): Activation: successful, device activated.
Oct  8 14:16:13 np0005477492 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  8 14:16:14 np0005477492 python3[4244]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  8 14:16:14 np0005477492 python3[4317]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759947373.9897876-259-193681390448784/source _original_basename=tmp6mly94i7 follow=False checksum=ba1a41cacae7c5f9c6c90c9809969ae695187c2f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:16:23 np0005477492 systemd[1173]: Starting Mark boot as successful...
Oct  8 14:16:23 np0005477492 systemd[1173]: Finished Mark boot as successful.
Oct  8 14:17:14 np0005477492 systemd-logind[786]: Session 1 logged out. Waiting for processes to exit.
Oct  8 14:19:23 np0005477492 systemd[1173]: Created slice User Background Tasks Slice.
Oct  8 14:19:23 np0005477492 systemd[1173]: Starting Cleanup of User's Temporary Files and Directories...
Oct  8 14:19:23 np0005477492 systemd[1173]: Finished Cleanup of User's Temporary Files and Directories.
Oct  8 14:21:41 np0005477492 systemd-logind[786]: New session 3 of user zuul.
Oct  8 14:21:41 np0005477492 systemd[1]: Started Session 3 of User zuul.
Oct  8 14:21:42 np0005477492 python3[4382]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163efc-24cc-0c3e-1585-000000001cea-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 14:21:42 np0005477492 python3[4411]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:21:42 np0005477492 python3[4437]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:21:42 np0005477492 python3[4463]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:21:43 np0005477492 python3[4489]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:21:43 np0005477492 python3[4515]: ansible-ansible.builtin.lineinfile Invoked with path=/etc/systemd/system.conf regexp=^#DefaultIOAccounting=no line=DefaultIOAccounting=yes state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:21:43 np0005477492 python3[4515]: ansible-ansible.builtin.lineinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Oct  8 14:21:44 np0005477492 python3[4541]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  8 14:21:44 np0005477492 systemd[1]: Reloading.
Oct  8 14:21:44 np0005477492 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 14:21:46 np0005477492 python3[4597]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Oct  8 14:21:46 np0005477492 python3[4624]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 14:21:46 np0005477492 python3[4652]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 14:21:47 np0005477492 python3[4680]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 14:21:47 np0005477492 python3[4708]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 14:21:48 np0005477492 python3[4735]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163efc-24cc-0c3e-1585-000000001cf0-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 14:21:48 np0005477492 python3[4765]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 14:21:50 np0005477492 systemd[1]: session-3.scope: Deactivated successfully.
Oct  8 14:21:50 np0005477492 systemd[1]: session-3.scope: Consumed 3.824s CPU time.
Oct  8 14:21:50 np0005477492 systemd-logind[786]: Session 3 logged out. Waiting for processes to exit.
Oct  8 14:21:50 np0005477492 systemd-logind[786]: Removed session 3.
Oct  8 14:21:52 np0005477492 systemd-logind[786]: New session 4 of user zuul.
Oct  8 14:21:52 np0005477492 systemd[1]: Started Session 4 of User zuul.
Oct  8 14:21:52 np0005477492 python3[4799]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Oct  8 14:21:54 np0005477492 irqbalance[781]: Cannot change IRQ 27 affinity: Operation not permitted
Oct  8 14:21:54 np0005477492 irqbalance[781]: IRQ 27 affinity is now unmanaged
Oct  8 14:22:09 np0005477492 kernel: SELinux:  Converting 366 SID table entries...
Oct  8 14:22:09 np0005477492 kernel: SELinux:  policy capability network_peer_controls=1
Oct  8 14:22:09 np0005477492 kernel: SELinux:  policy capability open_perms=1
Oct  8 14:22:09 np0005477492 kernel: SELinux:  policy capability extended_socket_class=1
Oct  8 14:22:09 np0005477492 kernel: SELinux:  policy capability always_check_network=0
Oct  8 14:22:09 np0005477492 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  8 14:22:09 np0005477492 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  8 14:22:09 np0005477492 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  8 14:22:18 np0005477492 kernel: SELinux:  Converting 366 SID table entries...
Oct  8 14:22:18 np0005477492 kernel: SELinux:  policy capability network_peer_controls=1
Oct  8 14:22:18 np0005477492 kernel: SELinux:  policy capability open_perms=1
Oct  8 14:22:18 np0005477492 kernel: SELinux:  policy capability extended_socket_class=1
Oct  8 14:22:18 np0005477492 kernel: SELinux:  policy capability always_check_network=0
Oct  8 14:22:18 np0005477492 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  8 14:22:18 np0005477492 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  8 14:22:18 np0005477492 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  8 14:22:27 np0005477492 kernel: SELinux:  Converting 366 SID table entries...
Oct  8 14:22:27 np0005477492 kernel: SELinux:  policy capability network_peer_controls=1
Oct  8 14:22:27 np0005477492 kernel: SELinux:  policy capability open_perms=1
Oct  8 14:22:27 np0005477492 kernel: SELinux:  policy capability extended_socket_class=1
Oct  8 14:22:27 np0005477492 kernel: SELinux:  policy capability always_check_network=0
Oct  8 14:22:27 np0005477492 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  8 14:22:27 np0005477492 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  8 14:22:27 np0005477492 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  8 14:22:28 np0005477492 setsebool[4860]: The virt_use_nfs policy boolean was changed to 1 by root
Oct  8 14:22:28 np0005477492 setsebool[4860]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Oct  8 14:22:39 np0005477492 kernel: SELinux:  Converting 369 SID table entries...
Oct  8 14:22:39 np0005477492 kernel: SELinux:  policy capability network_peer_controls=1
Oct  8 14:22:39 np0005477492 kernel: SELinux:  policy capability open_perms=1
Oct  8 14:22:39 np0005477492 kernel: SELinux:  policy capability extended_socket_class=1
Oct  8 14:22:39 np0005477492 kernel: SELinux:  policy capability always_check_network=0
Oct  8 14:22:39 np0005477492 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  8 14:22:39 np0005477492 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  8 14:22:39 np0005477492 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  8 14:22:58 np0005477492 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Oct  8 14:22:58 np0005477492 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  8 14:22:58 np0005477492 systemd[1]: Starting man-db-cache-update.service...
Oct  8 14:22:58 np0005477492 systemd[1]: Reloading.
Oct  8 14:22:58 np0005477492 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 14:22:58 np0005477492 systemd[1]: Starting dnf makecache...
Oct  8 14:22:58 np0005477492 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  8 14:22:59 np0005477492 dnf[5688]: Metadata cache refreshed recently.
Oct  8 14:22:59 np0005477492 systemd[1]: dnf-makecache.service: Deactivated successfully.
Oct  8 14:22:59 np0005477492 systemd[1]: Finished dnf makecache.
Oct  8 14:22:59 np0005477492 systemd[1]: Starting PackageKit Daemon...
Oct  8 14:22:59 np0005477492 systemd[1]: Starting Authorization Manager...
Oct  8 14:22:59 np0005477492 polkitd[6442]: Started polkitd version 0.117
Oct  8 14:23:00 np0005477492 systemd[1]: Started Authorization Manager.
Oct  8 14:23:00 np0005477492 systemd[1]: Started PackageKit Daemon.
Oct  8 14:23:00 np0005477492 python3[7220]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163efc-24cc-4be5-338d-00000000000a-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 14:23:01 np0005477492 kernel: evm: overlay not supported
Oct  8 14:23:01 np0005477492 systemd[1173]: Starting D-Bus User Message Bus...
Oct  8 14:23:01 np0005477492 dbus-broker-launch[8339]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Oct  8 14:23:01 np0005477492 dbus-broker-launch[8339]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Oct  8 14:23:01 np0005477492 systemd[1173]: Started D-Bus User Message Bus.
Oct  8 14:23:01 np0005477492 dbus-broker-lau[8339]: Ready
Oct  8 14:23:01 np0005477492 systemd[1173]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Oct  8 14:23:01 np0005477492 systemd[1173]: Created slice Slice /user.
Oct  8 14:23:01 np0005477492 systemd[1173]: podman-8196.scope: unit configures an IP firewall, but not running as root.
Oct  8 14:23:01 np0005477492 systemd[1173]: (This warning is only shown for the first unit using IP firewalling.)
Oct  8 14:23:01 np0005477492 systemd[1173]: Started podman-8196.scope.
Oct  8 14:23:01 np0005477492 systemd[1173]: Started podman-pause-cff9be10.scope.
Oct  8 14:23:02 np0005477492 python3[9027]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]#012location = "38.102.83.144:5001"#012insecure = true path=/etc/containers/registries.conf block=[[registry]]#012location = "38.102.83.144:5001"#012insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:23:03 np0005477492 systemd[1]: session-4.scope: Deactivated successfully.
Oct  8 14:23:03 np0005477492 systemd[1]: session-4.scope: Consumed 59.743s CPU time.
Oct  8 14:23:03 np0005477492 systemd-logind[786]: Session 4 logged out. Waiting for processes to exit.
Oct  8 14:23:03 np0005477492 systemd-logind[786]: Removed session 4.
Oct  8 14:23:27 np0005477492 systemd-logind[786]: New session 5 of user zuul.
Oct  8 14:23:27 np0005477492 systemd[1]: Started Session 5 of User zuul.
Oct  8 14:23:27 np0005477492 python3[17475]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBmlv1j+aGa8640hU9vjiWVqxTx2rSF/zFEeT0xtGZbhoTj9s/4Q4/74LscxvyeLpVYHPcKvOyj6VplCrllVdwQ= zuul@np0005477491.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 14:23:28 np0005477492 python3[17604]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBmlv1j+aGa8640hU9vjiWVqxTx2rSF/zFEeT0xtGZbhoTj9s/4Q4/74LscxvyeLpVYHPcKvOyj6VplCrllVdwQ= zuul@np0005477491.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 14:23:28 np0005477492 python3[17934]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005477492.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Oct  8 14:23:29 np0005477492 python3[18130]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBmlv1j+aGa8640hU9vjiWVqxTx2rSF/zFEeT0xtGZbhoTj9s/4Q4/74LscxvyeLpVYHPcKvOyj6VplCrllVdwQ= zuul@np0005477491.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  8 14:23:29 np0005477492 python3[18371]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  8 14:23:30 np0005477492 python3[18612]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759947809.659891-135-260311347994757/source _original_basename=tmpy4s67ikz follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:23:31 np0005477492 python3[18881]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Oct  8 14:23:31 np0005477492 systemd[1]: Starting Hostname Service...
Oct  8 14:23:31 np0005477492 systemd[1]: Started Hostname Service.
Oct  8 14:23:31 np0005477492 systemd-hostnamed[19017]: Changed pretty hostname to 'compute-0'
Oct  8 14:23:31 np0005477492 systemd-hostnamed[19017]: Hostname set to <compute-0> (static)
Oct  8 14:23:31 np0005477492 NetworkManager[4064]: <info>  [1759947811.4689] hostname: static hostname changed from "np0005477492.novalocal" to "compute-0"
Oct  8 14:23:31 np0005477492 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  8 14:23:31 np0005477492 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  8 14:23:31 np0005477492 systemd[1]: session-5.scope: Deactivated successfully.
Oct  8 14:23:31 np0005477492 systemd[1]: session-5.scope: Consumed 2.594s CPU time.
Oct  8 14:23:31 np0005477492 systemd-logind[786]: Session 5 logged out. Waiting for processes to exit.
Oct  8 14:23:31 np0005477492 systemd-logind[786]: Removed session 5.
Oct  8 14:23:41 np0005477492 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  8 14:23:59 np0005477492 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  8 14:23:59 np0005477492 systemd[1]: Finished man-db-cache-update.service.
Oct  8 14:23:59 np0005477492 systemd[1]: man-db-cache-update.service: Consumed 1min 3.149s CPU time.
Oct  8 14:23:59 np0005477492 systemd[1]: run-r885d39c242be4a69867203dce77aa0fc.service: Deactivated successfully.
Oct  8 14:24:01 np0005477492 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct  8 14:27:14 np0005477492 systemd-logind[786]: New session 6 of user zuul.
Oct  8 14:27:14 np0005477492 systemd[1]: Started Session 6 of User zuul.
Oct  8 14:27:14 np0005477492 python3[26766]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 14:27:16 np0005477492 python3[26882]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  8 14:27:17 np0005477492 python3[26955]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759948036.0610976-30234-198271197344507/source mode=0755 _original_basename=delorean.repo follow=False checksum=f3f029ef513950de857eede9231def34e37a0d9c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:27:17 np0005477492 python3[26981]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  8 14:27:17 np0005477492 python3[27054]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759948036.0610976-30234-198271197344507/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=0bdbb813b840548359ae77c28d76ca272ccaf31b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:27:17 np0005477492 python3[27080]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  8 14:27:18 np0005477492 python3[27153]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759948036.0610976-30234-198271197344507/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:27:18 np0005477492 python3[27179]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  8 14:27:19 np0005477492 python3[27252]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759948036.0610976-30234-198271197344507/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:27:19 np0005477492 python3[27278]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  8 14:27:19 np0005477492 python3[27351]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759948036.0610976-30234-198271197344507/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:27:20 np0005477492 python3[27377]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  8 14:27:20 np0005477492 python3[27450]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759948036.0610976-30234-198271197344507/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:27:20 np0005477492 python3[27476]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  8 14:27:21 np0005477492 python3[27549]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759948036.0610976-30234-198271197344507/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=75ca8f9fe9a538824fd094f239c30e8ce8652e8a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:27:21 np0005477492 python3[27575]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/gating.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  8 14:27:21 np0005477492 python3[27648]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759948036.0610976-30234-198271197344507/source mode=0755 _original_basename=gating.repo follow=False checksum=f36c1fb7fdad8ba1ec8db350742109a1bbc9d413 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:27:32 np0005477492 python3[27706]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 14:28:05 np0005477492 systemd[1]: packagekit.service: Deactivated successfully.
Oct  8 14:32:32 np0005477492 systemd[1]: session-6.scope: Deactivated successfully.
Oct  8 14:32:32 np0005477492 systemd[1]: session-6.scope: Consumed 6.317s CPU time.
Oct  8 14:32:32 np0005477492 systemd-logind[786]: Session 6 logged out. Waiting for processes to exit.
Oct  8 14:32:32 np0005477492 systemd-logind[786]: Removed session 6.
Oct  8 14:38:23 np0005477492 systemd-logind[786]: New session 7 of user zuul.
Oct  8 14:38:23 np0005477492 systemd[1]: Started Session 7 of User zuul.
Oct  8 14:38:24 np0005477492 python3.9[27868]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 14:38:26 np0005477492 python3.9[28049]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 14:38:34 np0005477492 systemd[1]: session-7.scope: Deactivated successfully.
Oct  8 14:38:34 np0005477492 systemd[1]: session-7.scope: Consumed 8.174s CPU time.
Oct  8 14:38:34 np0005477492 systemd-logind[786]: Session 7 logged out. Waiting for processes to exit.
Oct  8 14:38:34 np0005477492 systemd-logind[786]: Removed session 7.
Oct  8 14:38:39 np0005477492 systemd-logind[786]: New session 8 of user zuul.
Oct  8 14:38:39 np0005477492 systemd[1]: Started Session 8 of User zuul.
Oct  8 14:38:40 np0005477492 python3.9[28259]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 14:38:41 np0005477492 systemd[1]: session-8.scope: Deactivated successfully.
Oct  8 14:38:41 np0005477492 systemd-logind[786]: Session 8 logged out. Waiting for processes to exit.
Oct  8 14:38:41 np0005477492 systemd-logind[786]: Removed session 8.
Oct  8 14:38:57 np0005477492 systemd-logind[786]: New session 9 of user zuul.
Oct  8 14:38:57 np0005477492 systemd[1]: Started Session 9 of User zuul.
Oct  8 14:38:57 np0005477492 python3.9[28442]: ansible-ansible.legacy.ping Invoked with data=pong
Oct  8 14:38:59 np0005477492 python3.9[28616]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 14:39:00 np0005477492 python3.9[28768]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 14:39:01 np0005477492 python3.9[28921]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 14:39:02 np0005477492 python3.9[29073]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:39:02 np0005477492 python3.9[29226]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:39:03 np0005477492 python3.9[29349]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1759948742.3664114-73-107599708268539/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:39:04 np0005477492 python3.9[29501]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 14:39:05 np0005477492 python3.9[29657]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 14:39:06 np0005477492 python3.9[29807]: ansible-ansible.builtin.service_facts Invoked
Oct  8 14:39:11 np0005477492 python3.9[30062]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:39:12 np0005477492 python3.9[30212]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 14:39:13 np0005477492 python3.9[30366]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 14:39:14 np0005477492 python3.9[30524]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  8 14:39:15 np0005477492 python3.9[30608]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  8 14:39:59 np0005477492 dbus-broker-launch[743]: Noticed file-system modification, trigger reload.
Oct  8 14:39:59 np0005477492 dbus-broker-launch[743]: Noticed file-system modification, trigger reload.
Oct  8 14:39:59 np0005477492 dbus-broker-launch[743]: Noticed file-system modification, trigger reload.
Oct  8 14:39:59 np0005477492 dbus-broker-launch[8339]: Noticed file-system modification, trigger reload.
Oct  8 14:39:59 np0005477492 dbus-broker-launch[8339]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Oct  8 14:39:59 np0005477492 dbus-broker-launch[8339]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Oct  8 14:39:59 np0005477492 systemd[1]: Reexecuting.
Oct  8 14:39:59 np0005477492 systemd: systemd 252-57.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Oct  8 14:39:59 np0005477492 systemd: Detected virtualization kvm.
Oct  8 14:39:59 np0005477492 systemd: Detected architecture x86-64.
Oct  8 14:39:59 np0005477492 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 14:39:59 np0005477492 systemd[1]: Reloading.
Oct  8 14:40:00 np0005477492 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 14:40:00 np0005477492 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Oct  8 14:40:01 np0005477492 systemd[1]: Reloading.
Oct  8 14:40:01 np0005477492 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 14:40:01 np0005477492 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Oct  8 14:40:01 np0005477492 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Oct  8 14:40:01 np0005477492 systemd[1]: Reloading.
Oct  8 14:40:01 np0005477492 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 14:40:02 np0005477492 systemd[1]: Listening on LVM2 poll daemon socket.
Oct  8 14:40:02 np0005477492 dbus-broker-launch[743]: Noticed file-system modification, trigger reload.
Oct  8 14:40:02 np0005477492 dbus-broker-launch[743]: Noticed file-system modification, trigger reload.
Oct  8 14:40:02 np0005477492 dbus-broker-launch[743]: Noticed file-system modification, trigger reload.
Oct  8 14:41:10 np0005477492 kernel: SELinux:  Converting 2714 SID table entries...
Oct  8 14:41:10 np0005477492 kernel: SELinux:  policy capability network_peer_controls=1
Oct  8 14:41:10 np0005477492 kernel: SELinux:  policy capability open_perms=1
Oct  8 14:41:10 np0005477492 kernel: SELinux:  policy capability extended_socket_class=1
Oct  8 14:41:10 np0005477492 kernel: SELinux:  policy capability always_check_network=0
Oct  8 14:41:10 np0005477492 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  8 14:41:10 np0005477492 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  8 14:41:10 np0005477492 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  8 14:41:10 np0005477492 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Oct  8 14:41:11 np0005477492 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  8 14:41:11 np0005477492 systemd[1]: Starting man-db-cache-update.service...
Oct  8 14:41:11 np0005477492 systemd[1]: Reloading.
Oct  8 14:41:11 np0005477492 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 14:41:11 np0005477492 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  8 14:41:11 np0005477492 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  8 14:41:11 np0005477492 systemd-journald[677]: Journal stopped
Oct  8 14:41:11 np0005477492 systemd-journald: Received SIGTERM from PID 1 (systemd).
Oct  8 14:41:11 np0005477492 systemd: Stopping Journal Service...
Oct  8 14:41:11 np0005477492 systemd: Stopping Rule-based Manager for Device Events and Files...
Oct  8 14:41:11 np0005477492 systemd: systemd-journald.service: Deactivated successfully.
Oct  8 14:41:11 np0005477492 systemd: Stopped Journal Service.
Oct  8 14:41:11 np0005477492 systemd: Starting Journal Service...
Oct  8 14:41:11 np0005477492 systemd: systemd-udevd.service: Deactivated successfully.
Oct  8 14:41:11 np0005477492 systemd: Stopped Rule-based Manager for Device Events and Files.
Oct  8 14:41:11 np0005477492 systemd: systemd-udevd.service: Consumed 2.163s CPU time.
Oct  8 14:41:11 np0005477492 systemd: Starting Rule-based Manager for Device Events and Files...
Oct  8 14:41:11 np0005477492 systemd-journald[31335]: Journal started
Oct  8 14:41:11 np0005477492 systemd-journald[31335]: Runtime Journal (/run/log/journal/42833e1b511a402df82cb9cb2fc36491) is 8.0M, max 153.5M, 145.5M free.
Oct  8 14:41:11 np0005477492 systemd: Started Journal Service.
Oct  8 14:41:11 np0005477492 systemd-udevd[31345]: Using default interface naming scheme 'rhel-9.0'.
Oct  8 14:41:11 np0005477492 systemd[1]: Started Rule-based Manager for Device Events and Files.
Oct  8 14:41:12 np0005477492 systemd[1]: Reloading.
Oct  8 14:41:12 np0005477492 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 14:41:12 np0005477492 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  8 14:41:13 np0005477492 systemd[1]: Starting PackageKit Daemon...
Oct  8 14:41:13 np0005477492 systemd[1]: Started PackageKit Daemon.
Oct  8 14:41:14 np0005477492 python3.9[33543]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 14:41:16 np0005477492 python3.9[35574]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Oct  8 14:41:17 np0005477492 python3.9[36625]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Oct  8 14:41:22 np0005477492 python3.9[38244]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:41:23 np0005477492 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  8 14:41:23 np0005477492 systemd[1]: Finished man-db-cache-update.service.
Oct  8 14:41:23 np0005477492 systemd[1]: man-db-cache-update.service: Consumed 12.614s CPU time.
Oct  8 14:41:23 np0005477492 systemd[1]: run-rfa6cbfdf4191424db4218d293089c3ed.service: Deactivated successfully.
Oct  8 14:41:23 np0005477492 systemd[1]: run-rbd46bfe85d6f48688228b2023021287f.service: Deactivated successfully.
Oct  8 14:41:23 np0005477492 python3.9[39892]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Oct  8 14:41:24 np0005477492 irqbalance[781]: Cannot change IRQ 26 affinity: Operation not permitted
Oct  8 14:41:24 np0005477492 irqbalance[781]: IRQ 26 affinity is now unmanaged
Oct  8 14:41:25 np0005477492 python3.9[40045]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 14:41:25 np0005477492 python3.9[40197]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:41:26 np0005477492 python3.9[40320]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759948885.3515654-227-66795712115742/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=77fff1c6b7e11d6b8bf60629262eb6aa0aa1c835 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:41:27 np0005477492 python3.9[40472]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Oct  8 14:41:28 np0005477492 python3.9[40625]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  8 14:41:28 np0005477492 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  8 14:41:28 np0005477492 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  8 14:41:30 np0005477492 python3.9[40784]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct  8 14:41:30 np0005477492 python3.9[40944]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Oct  8 14:41:31 np0005477492 python3.9[41097]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  8 14:41:32 np0005477492 python3.9[41255]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Oct  8 14:41:33 np0005477492 python3.9[41407]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  8 14:41:35 np0005477492 python3.9[41560]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 14:41:36 np0005477492 python3.9[41712]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:41:37 np0005477492 python3.9[41835]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759948896.016033-322-162220807073682/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  8 14:41:38 np0005477492 python3.9[41987]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 14:41:38 np0005477492 systemd[1]: Starting Load Kernel Modules...
Oct  8 14:41:38 np0005477492 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Oct  8 14:41:38 np0005477492 kernel: Bridge firewalling registered
Oct  8 14:41:38 np0005477492 systemd-modules-load[41991]: Inserted module 'br_netfilter'
Oct  8 14:41:38 np0005477492 systemd[1]: Finished Load Kernel Modules.
Oct  8 14:41:39 np0005477492 python3.9[42146]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:41:40 np0005477492 python3.9[42269]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759948898.9973867-345-175722957994491/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  8 14:41:41 np0005477492 python3.9[42421]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  8 14:41:44 np0005477492 dbus-broker-launch[743]: Noticed file-system modification, trigger reload.
Oct  8 14:41:44 np0005477492 dbus-broker-launch[743]: Noticed file-system modification, trigger reload.
Oct  8 14:41:44 np0005477492 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  8 14:41:44 np0005477492 systemd[1]: Starting man-db-cache-update.service...
Oct  8 14:41:44 np0005477492 systemd[1]: Reloading.
Oct  8 14:41:44 np0005477492 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 14:41:44 np0005477492 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  8 14:41:46 np0005477492 python3.9[43728]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 14:41:47 np0005477492 python3.9[44563]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Oct  8 14:41:47 np0005477492 python3.9[45302]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 14:41:48 np0005477492 python3.9[46123]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 14:41:48 np0005477492 systemd[1]: Starting Dynamic System Tuning Daemon...
Oct  8 14:41:48 np0005477492 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  8 14:41:48 np0005477492 systemd[1]: Finished man-db-cache-update.service.
Oct  8 14:41:48 np0005477492 systemd[1]: man-db-cache-update.service: Consumed 5.492s CPU time.
Oct  8 14:41:48 np0005477492 systemd[1]: run-r0c75d9b007074251ab2e73f725a4b9d4.service: Deactivated successfully.
Oct  8 14:41:49 np0005477492 systemd[1]: Started Dynamic System Tuning Daemon.
Oct  8 14:41:50 np0005477492 python3.9[46808]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 14:41:50 np0005477492 systemd[1]: Stopping Dynamic System Tuning Daemon...
Oct  8 14:41:50 np0005477492 systemd[1]: tuned.service: Deactivated successfully.
Oct  8 14:41:50 np0005477492 systemd[1]: Stopped Dynamic System Tuning Daemon.
Oct  8 14:41:50 np0005477492 systemd[1]: Starting Dynamic System Tuning Daemon...
Oct  8 14:41:50 np0005477492 systemd[1]: Started Dynamic System Tuning Daemon.
Oct  8 14:41:51 np0005477492 python3.9[46969]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Oct  8 14:41:53 np0005477492 python3.9[47121]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 14:41:53 np0005477492 systemd[1]: Reloading.
Oct  8 14:41:54 np0005477492 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 14:41:54 np0005477492 python3.9[47310]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 14:41:55 np0005477492 systemd[1]: Reloading.
Oct  8 14:41:55 np0005477492 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 14:41:56 np0005477492 python3.9[47499]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 14:41:56 np0005477492 python3.9[47652]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 14:41:56 np0005477492 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Oct  8 14:41:57 np0005477492 python3.9[47805]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 14:41:59 np0005477492 python3.9[47967]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 14:42:00 np0005477492 python3.9[48120]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 14:42:00 np0005477492 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Oct  8 14:42:00 np0005477492 systemd[1]: Stopped Apply Kernel Variables.
Oct  8 14:42:00 np0005477492 systemd[1]: Stopping Apply Kernel Variables...
Oct  8 14:42:00 np0005477492 systemd[1]: Starting Apply Kernel Variables...
Oct  8 14:42:00 np0005477492 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Oct  8 14:42:00 np0005477492 systemd[1]: Finished Apply Kernel Variables.
Oct  8 14:42:01 np0005477492 systemd[1]: session-9.scope: Deactivated successfully.
Oct  8 14:42:01 np0005477492 systemd[1]: session-9.scope: Consumed 2min 27.403s CPU time.
Oct  8 14:42:01 np0005477492 systemd-logind[786]: Session 9 logged out. Waiting for processes to exit.
Oct  8 14:42:01 np0005477492 systemd-logind[786]: Removed session 9.
Oct  8 14:42:07 np0005477492 systemd-logind[786]: New session 10 of user zuul.
Oct  8 14:42:07 np0005477492 systemd[1]: Started Session 10 of User zuul.
Oct  8 14:42:08 np0005477492 python3.9[48303]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 14:42:09 np0005477492 python3.9[48457]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 14:42:10 np0005477492 python3.9[48613]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 14:42:11 np0005477492 python3.9[48764]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 14:42:12 np0005477492 python3.9[48920]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  8 14:42:13 np0005477492 python3.9[49004]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  8 14:42:15 np0005477492 python3.9[49157]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  8 14:42:16 np0005477492 python3.9[49328]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:42:17 np0005477492 python3.9[49480]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 14:42:17 np0005477492 systemd[1]: var-lib-containers-storage-overlay-compat4058148395-merged.mount: Deactivated successfully.
Oct  8 14:42:17 np0005477492 systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck1570085433-merged.mount: Deactivated successfully.
Oct  8 14:42:17 np0005477492 podman[49481]: 2025-10-08 18:42:17.776526286 +0000 UTC m=+0.081659136 system refresh
Oct  8 14:42:18 np0005477492 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 14:42:18 np0005477492 python3.9[49644]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:42:19 np0005477492 python3.9[49767]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759948938.0029533-109-193826332996096/.source.json follow=False _original_basename=podman_network_config.j2 checksum=9767a62541e27415326795ae6f1da9596d558e9e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:42:20 np0005477492 python3.9[49919]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:42:21 np0005477492 python3.9[50042]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759948939.8142407-124-183908076963244/.source.conf follow=False _original_basename=registries.conf.j2 checksum=e15dcb19216ab36551f9f97c300e7ebe1c03c80c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  8 14:42:22 np0005477492 python3.9[50194]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  8 14:42:22 np0005477492 python3.9[50346]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  8 14:42:23 np0005477492 python3.9[50498]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  8 14:42:24 np0005477492 python3.9[50650]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  8 14:42:25 np0005477492 python3.9[50800]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 14:42:26 np0005477492 python3.9[50954]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  8 14:42:28 np0005477492 python3.9[51107]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  8 14:42:30 np0005477492 python3.9[51267]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  8 14:42:32 np0005477492 python3.9[51420]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  8 14:42:34 np0005477492 python3.9[51573]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  8 14:42:37 np0005477492 python3.9[51729]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  8 14:42:41 np0005477492 python3.9[51897]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  8 14:42:43 np0005477492 python3.9[52050]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  8 14:43:00 np0005477492 python3.9[52389]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:43:01 np0005477492 python3.9[52564]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:43:01 np0005477492 python3.9[52687]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1759948980.5728247-263-204904035336521/.source.json _original_basename=.l49bcobp follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:43:02 np0005477492 python3.9[52839]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  8 14:43:03 np0005477492 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 14:43:05 np0005477492 systemd[1]: var-lib-containers-storage-overlay-compat360510889-lower\x2dmapped.mount: Deactivated successfully.
Oct  8 14:43:10 np0005477492 podman[52851]: 2025-10-08 18:43:10.05383283 +0000 UTC m=+7.013998763 image pull 74877095db294c27659f24e7f86074178a6f28eee68561c30e3ce4d18519e09c quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f
Oct  8 14:43:10 np0005477492 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 14:43:10 np0005477492 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 14:43:10 np0005477492 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 14:43:11 np0005477492 python3.9[53152]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  8 14:43:11 np0005477492 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 14:43:13 np0005477492 podman[53166]: 2025-10-08 18:43:13.769632606 +0000 UTC m=+2.407669785 image pull 70c92fb64e1eda6ef063d34e60e9a541e44edbaa51e757e8304331202c76a3a7 quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857
Oct  8 14:43:13 np0005477492 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 14:43:13 np0005477492 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 14:43:13 np0005477492 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 14:43:15 np0005477492 python3.9[53424]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  8 14:43:15 np0005477492 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 14:43:23 np0005477492 podman[53436]: 2025-10-08 18:43:23.129979782 +0000 UTC m=+8.045943084 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct  8 14:43:23 np0005477492 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 14:43:23 np0005477492 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 14:43:23 np0005477492 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 14:43:24 np0005477492 python3.9[53710]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  8 14:43:24 np0005477492 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 14:43:25 np0005477492 podman[53724]: 2025-10-08 18:43:25.585537482 +0000 UTC m=+1.280835940 image pull f541ff382622bd8bc9ad206129d2a8e74c239ff4503fa3b67d3bdf6d5b50b511 quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43
Oct  8 14:43:25 np0005477492 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 14:43:25 np0005477492 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 14:43:25 np0005477492 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 14:43:26 np0005477492 python3.9[53959]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  8 14:43:26 np0005477492 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 14:43:36 np0005477492 podman[53972]: 2025-10-08 18:43:36.11959631 +0000 UTC m=+9.349066744 image pull 7ac362f4e836cf46e10a309acb4abf774df9481a1d6404c213437495cfb42f5d quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844
Oct  8 14:43:36 np0005477492 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 14:43:36 np0005477492 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 14:43:36 np0005477492 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 14:43:37 np0005477492 python3.9[54231]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  8 14:43:37 np0005477492 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 14:43:40 np0005477492 podman[54244]: 2025-10-08 18:43:40.929030878 +0000 UTC m=+3.636675445 image pull 5397cd841d80292a5786d82cb8a2bcd574988efb08c605ba6eaaa59d6f646815 quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189
Oct  8 14:43:40 np0005477492 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 14:43:40 np0005477492 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 14:43:41 np0005477492 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 14:43:41 np0005477492 python3.9[54496]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  8 14:43:41 np0005477492 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 14:43:43 np0005477492 podman[54508]: 2025-10-08 18:43:43.298853317 +0000 UTC m=+1.335457079 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Oct  8 14:43:43 np0005477492 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 14:43:43 np0005477492 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 14:43:43 np0005477492 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 14:43:44 np0005477492 systemd[1]: session-10.scope: Deactivated successfully.
Oct  8 14:43:44 np0005477492 systemd[1]: session-10.scope: Consumed 1min 57.927s CPU time.
Oct  8 14:43:44 np0005477492 systemd-logind[786]: Session 10 logged out. Waiting for processes to exit.
Oct  8 14:43:44 np0005477492 systemd-logind[786]: Removed session 10.
Oct  8 14:43:49 np0005477492 systemd-logind[786]: New session 11 of user zuul.
Oct  8 14:43:49 np0005477492 systemd[1]: Started Session 11 of User zuul.
Oct  8 14:43:50 np0005477492 python3.9[54812]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 14:43:51 np0005477492 python3.9[54968]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Oct  8 14:43:52 np0005477492 python3.9[55121]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  8 14:43:53 np0005477492 python3.9[55279]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct  8 14:43:54 np0005477492 python3.9[55439]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  8 14:43:55 np0005477492 python3.9[55523]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  8 14:43:58 np0005477492 python3.9[55684]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  8 14:44:10 np0005477492 kernel: SELinux:  Converting 2726 SID table entries...
Oct  8 14:44:10 np0005477492 kernel: SELinux:  policy capability network_peer_controls=1
Oct  8 14:44:10 np0005477492 kernel: SELinux:  policy capability open_perms=1
Oct  8 14:44:10 np0005477492 kernel: SELinux:  policy capability extended_socket_class=1
Oct  8 14:44:10 np0005477492 kernel: SELinux:  policy capability always_check_network=0
Oct  8 14:44:10 np0005477492 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  8 14:44:10 np0005477492 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  8 14:44:10 np0005477492 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  8 14:44:10 np0005477492 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Oct  8 14:44:10 np0005477492 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Oct  8 14:44:12 np0005477492 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  8 14:44:12 np0005477492 systemd[1]: Starting man-db-cache-update.service...
Oct  8 14:44:12 np0005477492 systemd[1]: Reloading.
Oct  8 14:44:12 np0005477492 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 14:44:12 np0005477492 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 14:44:12 np0005477492 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  8 14:44:13 np0005477492 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  8 14:44:13 np0005477492 systemd[1]: Finished man-db-cache-update.service.
Oct  8 14:44:13 np0005477492 systemd[1]: man-db-cache-update.service: Consumed 1.074s CPU time.
Oct  8 14:44:13 np0005477492 systemd[1]: run-rc0265a27346c43539443a10b8fa98cde.service: Deactivated successfully.
Oct  8 14:44:14 np0005477492 python3.9[56786]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  8 14:44:14 np0005477492 systemd[1]: Reloading.
Oct  8 14:44:14 np0005477492 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 14:44:14 np0005477492 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 14:44:14 np0005477492 systemd[1]: Starting Open vSwitch Database Unit...
Oct  8 14:44:14 np0005477492 chown[56828]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Oct  8 14:44:15 np0005477492 ovs-ctl[56833]: /etc/openvswitch/conf.db does not exist ... (warning).
Oct  8 14:44:15 np0005477492 ovs-ctl[56833]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Oct  8 14:44:15 np0005477492 ovs-ctl[56833]: Starting ovsdb-server [  OK  ]
Oct  8 14:44:15 np0005477492 ovs-vsctl[56882]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Oct  8 14:44:15 np0005477492 ovs-vsctl[56902]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"47f81f7a-64d8-418a-a74c-b879bd6deb83\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Oct  8 14:44:15 np0005477492 ovs-ctl[56833]: Configuring Open vSwitch system IDs [  OK  ]
Oct  8 14:44:15 np0005477492 ovs-ctl[56833]: Enabling remote OVSDB managers [  OK  ]
Oct  8 14:44:15 np0005477492 ovs-vsctl[56908]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Oct  8 14:44:15 np0005477492 systemd[1]: Started Open vSwitch Database Unit.
Oct  8 14:44:15 np0005477492 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Oct  8 14:44:15 np0005477492 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Oct  8 14:44:15 np0005477492 systemd[1]: Starting Open vSwitch Forwarding Unit...
Oct  8 14:44:15 np0005477492 kernel: openvswitch: Open vSwitch switching datapath
Oct  8 14:44:15 np0005477492 ovs-ctl[56952]: Inserting openvswitch module [  OK  ]
Oct  8 14:44:15 np0005477492 ovs-ctl[56921]: Starting ovs-vswitchd [  OK  ]
Oct  8 14:44:15 np0005477492 ovs-vsctl[56973]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Oct  8 14:44:15 np0005477492 ovs-ctl[56921]: Enabling remote OVSDB managers [  OK  ]
Oct  8 14:44:15 np0005477492 systemd[1]: Started Open vSwitch Forwarding Unit.
Oct  8 14:44:15 np0005477492 systemd[1]: Starting Open vSwitch...
Oct  8 14:44:15 np0005477492 systemd[1]: Finished Open vSwitch.
Oct  8 14:44:16 np0005477492 python3.9[57124]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 14:44:17 np0005477492 python3.9[57276]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Oct  8 14:44:18 np0005477492 kernel: SELinux:  Converting 2740 SID table entries...
Oct  8 14:44:19 np0005477492 kernel: SELinux:  policy capability network_peer_controls=1
Oct  8 14:44:19 np0005477492 kernel: SELinux:  policy capability open_perms=1
Oct  8 14:44:19 np0005477492 kernel: SELinux:  policy capability extended_socket_class=1
Oct  8 14:44:19 np0005477492 kernel: SELinux:  policy capability always_check_network=0
Oct  8 14:44:19 np0005477492 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  8 14:44:19 np0005477492 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  8 14:44:19 np0005477492 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  8 14:44:20 np0005477492 python3.9[57431]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 14:44:20 np0005477492 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Oct  8 14:44:21 np0005477492 python3.9[57589]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  8 14:44:23 np0005477492 python3.9[57744]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 14:44:24 np0005477492 python3.9[58031]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct  8 14:44:25 np0005477492 python3.9[58181]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 14:44:26 np0005477492 python3.9[58335]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  8 14:44:28 np0005477492 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  8 14:44:28 np0005477492 systemd[1]: Starting man-db-cache-update.service...
Oct  8 14:44:28 np0005477492 systemd[1]: Reloading.
Oct  8 14:44:28 np0005477492 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 14:44:28 np0005477492 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 14:44:28 np0005477492 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  8 14:44:28 np0005477492 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  8 14:44:28 np0005477492 systemd[1]: Finished man-db-cache-update.service.
Oct  8 14:44:28 np0005477492 systemd[1]: run-rff1b94dca8bd452cb297c514060969c2.service: Deactivated successfully.
Oct  8 14:44:29 np0005477492 python3.9[58654]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 14:44:29 np0005477492 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Oct  8 14:44:29 np0005477492 systemd[1]: Stopped Network Manager Wait Online.
Oct  8 14:44:29 np0005477492 systemd[1]: Stopping Network Manager Wait Online...
Oct  8 14:44:29 np0005477492 systemd[1]: Stopping Network Manager...
Oct  8 14:44:29 np0005477492 NetworkManager[4064]: <info>  [1759949069.4886] caught SIGTERM, shutting down normally.
Oct  8 14:44:29 np0005477492 NetworkManager[4064]: <info>  [1759949069.4899] dhcp4 (eth0): canceled DHCP transaction
Oct  8 14:44:29 np0005477492 NetworkManager[4064]: <info>  [1759949069.4899] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  8 14:44:29 np0005477492 NetworkManager[4064]: <info>  [1759949069.4899] dhcp4 (eth0): state changed no lease
Oct  8 14:44:29 np0005477492 NetworkManager[4064]: <info>  [1759949069.4901] manager: NetworkManager state is now CONNECTED_SITE
Oct  8 14:44:29 np0005477492 NetworkManager[4064]: <info>  [1759949069.4957] exiting (success)
Oct  8 14:44:29 np0005477492 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  8 14:44:29 np0005477492 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  8 14:44:29 np0005477492 systemd[1]: NetworkManager.service: Deactivated successfully.
Oct  8 14:44:29 np0005477492 systemd[1]: Stopped Network Manager.
Oct  8 14:44:29 np0005477492 systemd[1]: NetworkManager.service: Consumed 9.581s CPU time, 4.2M memory peak, read 0B from disk, written 13.5K to disk.
Oct  8 14:44:29 np0005477492 systemd[1]: Starting Network Manager...
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.5732] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:cb0c00bd-184a-4765-8005-06a5fc6550cb)
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.5735] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.5797] manager[0x55a3521ed090]: monitoring kernel firmware directory '/lib/firmware'.
Oct  8 14:44:29 np0005477492 systemd[1]: Starting Hostname Service...
Oct  8 14:44:29 np0005477492 systemd[1]: Started Hostname Service.
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.6986] hostname: hostname: using hostnamed
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.6987] hostname: static hostname changed from (none) to "compute-0"
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.6997] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7005] manager[0x55a3521ed090]: rfkill: Wi-Fi hardware radio set enabled
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7006] manager[0x55a3521ed090]: rfkill: WWAN hardware radio set enabled
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7042] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7057] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7059] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7060] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7061] manager: Networking is enabled by state file
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7066] settings: Loaded settings plugin: keyfile (internal)
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7071] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7137] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7149] dhcp: init: Using DHCP client 'internal'
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7151] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7158] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7166] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7175] device (lo): Activation: starting connection 'lo' (da957721-2ac9-44f5-bcd8-228e504809c9)
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7183] device (eth0): carrier: link connected
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7187] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7194] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7194] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7201] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7210] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7216] device (eth1): carrier: link connected
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7219] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7223] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (f659475b-7c6f-5319-b371-519bc515c6f0) (indicated)
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7224] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7228] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7234] device (eth1): Activation: starting connection 'ci-private-network' (f659475b-7c6f-5319-b371-519bc515c6f0)
Oct  8 14:44:29 np0005477492 systemd[1]: Started Network Manager.
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7244] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7251] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7254] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7256] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7257] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7260] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7262] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7264] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7266] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7270] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7272] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7281] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7316] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7324] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7326] dhcp4 (eth0): state changed new lease, address=38.102.83.120
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7329] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7333] device (lo): Activation: successful, device activated.
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7346] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct  8 14:44:29 np0005477492 systemd[1]: Starting Network Manager Wait Online...
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7412] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7419] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7421] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7425] manager: NetworkManager state is now CONNECTED_LOCAL
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7430] device (eth1): Activation: successful, device activated.
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7441] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7443] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7446] manager: NetworkManager state is now CONNECTED_SITE
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7452] device (eth0): Activation: successful, device activated.
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7455] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct  8 14:44:29 np0005477492 NetworkManager[58665]: <info>  [1759949069.7457] manager: startup complete
Oct  8 14:44:29 np0005477492 systemd[1]: Finished Network Manager Wait Online.
Oct  8 14:44:30 np0005477492 python3.9[58880]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  8 14:44:35 np0005477492 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  8 14:44:35 np0005477492 systemd[1]: Starting man-db-cache-update.service...
Oct  8 14:44:35 np0005477492 systemd[1]: Reloading.
Oct  8 14:44:35 np0005477492 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 14:44:35 np0005477492 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 14:44:36 np0005477492 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  8 14:44:36 np0005477492 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  8 14:44:36 np0005477492 systemd[1]: Finished man-db-cache-update.service.
Oct  8 14:44:36 np0005477492 systemd[1]: run-r21aa8b6197d3406da6e61f287a62c5bc.service: Deactivated successfully.
Oct  8 14:44:37 np0005477492 python3.9[59340]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 14:44:38 np0005477492 python3.9[59492]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:44:39 np0005477492 python3.9[59646]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:44:39 np0005477492 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  8 14:44:40 np0005477492 python3.9[59798]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:44:40 np0005477492 python3.9[59950]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:44:41 np0005477492 python3.9[60102]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:44:42 np0005477492 python3.9[60254]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:44:43 np0005477492 python3.9[60377]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1759949081.8804777-229-9144220621120/.source _original_basename=._em82fxv follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:44:44 np0005477492 python3.9[60529]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:44:44 np0005477492 python3.9[60681]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Oct  8 14:44:45 np0005477492 python3.9[60833]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:44:48 np0005477492 python3.9[61260]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Oct  8 14:44:49 np0005477492 ansible-async_wrapper.py[61435]: Invoked with j624244218040 300 /home/zuul/.ansible/tmp/ansible-tmp-1759949088.4709277-295-56361785687098/AnsiballZ_edpm_os_net_config.py _
Oct  8 14:44:49 np0005477492 ansible-async_wrapper.py[61438]: Starting module and watcher
Oct  8 14:44:49 np0005477492 ansible-async_wrapper.py[61438]: Start watching 61439 (300)
Oct  8 14:44:49 np0005477492 ansible-async_wrapper.py[61439]: Start module (61439)
Oct  8 14:44:49 np0005477492 ansible-async_wrapper.py[61435]: Return async_wrapper task started.
Oct  8 14:44:49 np0005477492 python3.9[61440]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Oct  8 14:44:50 np0005477492 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Oct  8 14:44:50 np0005477492 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Oct  8 14:44:50 np0005477492 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Oct  8 14:44:50 np0005477492 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Oct  8 14:44:50 np0005477492 kernel: cfg80211: failed to load regulatory.db
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.7537] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=61441 uid=0 result="success"
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.7562] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=61441 uid=0 result="success"
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.8520] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.8524] audit: op="connection-add" uuid="e5fe6336-0393-4f39-89c9-10707afd900f" name="br-ex-br" pid=61441 uid=0 result="success"
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.8550] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.8555] audit: op="connection-add" uuid="8546a6a0-ff01-4e77-9a80-0ccb84b09e15" name="br-ex-port" pid=61441 uid=0 result="success"
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.8577] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.8581] audit: op="connection-add" uuid="19cf4192-6bcb-444b-ae5b-b65ed7eb80f5" name="eth1-port" pid=61441 uid=0 result="success"
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.8605] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.8608] audit: op="connection-add" uuid="250e055d-e2d6-47d8-a97e-ae3fcd2ad51e" name="vlan20-port" pid=61441 uid=0 result="success"
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.8630] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.8635] audit: op="connection-add" uuid="ae6c2a0f-6e02-482e-bde7-7c80cfee7790" name="vlan21-port" pid=61441 uid=0 result="success"
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.8657] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.8661] audit: op="connection-add" uuid="f99e4b34-6f01-46fc-9977-6ae5bcc3a7ce" name="vlan22-port" pid=61441 uid=0 result="success"
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.8695] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="connection.timestamp,connection.autoconnect-priority,ipv4.dhcp-timeout,ipv4.dhcp-client-id,802-3-ethernet.mtu,ipv6.addr-gen-mode,ipv6.dhcp-timeout,ipv6.method" pid=61441 uid=0 result="success"
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.8725] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.8729] audit: op="connection-add" uuid="794f8bbe-0f95-43e1-a59b-5efcb30fbf56" name="br-ex-if" pid=61441 uid=0 result="success"
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.8799] audit: op="connection-update" uuid="f659475b-7c6f-5319-b371-519bc515c6f0" name="ci-private-network" args="connection.timestamp,connection.slave-type,connection.controller,connection.master,connection.port-type,ipv4.dns,ipv4.routing-rules,ipv4.routes,ipv4.addresses,ipv4.method,ipv4.never-default,ipv6.dns,ipv6.routing-rules,ipv6.routes,ipv6.addr-gen-mode,ipv6.addresses,ipv6.method,ovs-external-ids.data,ovs-interface.type" pid=61441 uid=0 result="success"
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.8828] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.8831] audit: op="connection-add" uuid="7ec5abe9-78ff-4cf6-b429-5715d559f836" name="vlan20-if" pid=61441 uid=0 result="success"
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.8859] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.8862] audit: op="connection-add" uuid="de041523-e9cd-491b-892a-fc68198a2acd" name="vlan21-if" pid=61441 uid=0 result="success"
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.8891] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.8895] audit: op="connection-add" uuid="52d34180-b43f-4c96-a950-d37fc7c59cd0" name="vlan22-if" pid=61441 uid=0 result="success"
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.8914] audit: op="connection-delete" uuid="8bdaa810-1639-3a4a-8290-5b06515ee0ed" name="Wired connection 1" pid=61441 uid=0 result="success"
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.8935] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.8951] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.8958] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (e5fe6336-0393-4f39-89c9-10707afd900f)
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.8960] audit: op="connection-activate" uuid="e5fe6336-0393-4f39-89c9-10707afd900f" name="br-ex-br" pid=61441 uid=0 result="success"
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.8964] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.8975] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.8981] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (8546a6a0-ff01-4e77-9a80-0ccb84b09e15)
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.8986] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.8995] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9002] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (19cf4192-6bcb-444b-ae5b-b65ed7eb80f5)
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9009] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9024] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9031] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (250e055d-e2d6-47d8-a97e-ae3fcd2ad51e)
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9034] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9046] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9052] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (ae6c2a0f-6e02-482e-bde7-7c80cfee7790)
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9056] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9068] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9075] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (f99e4b34-6f01-46fc-9977-6ae5bcc3a7ce)
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9076] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9080] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9083] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9094] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9102] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9109] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (794f8bbe-0f95-43e1-a59b-5efcb30fbf56)
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9111] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9116] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9119] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9121] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9123] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9142] device (eth1): disconnecting for new activation request.
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9143] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9149] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9151] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9154] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9158] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9165] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9172] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (7ec5abe9-78ff-4cf6-b429-5715d559f836)
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9174] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9178] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9181] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9183] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9188] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9198] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9207] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (de041523-e9cd-491b-892a-fc68198a2acd)
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9208] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9212] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9216] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9217] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9222] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9233] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9241] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (52d34180-b43f-4c96-a950-d37fc7c59cd0)
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9242] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9249] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9253] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9255] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9258] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9282] audit: op="device-reapply" interface="eth0" ifindex=2 args="connection.autoconnect-priority,ipv4.dhcp-timeout,ipv4.dhcp-client-id,802-3-ethernet.mtu,ipv6.addr-gen-mode,ipv6.method" pid=61441 uid=0 result="success"
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9288] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9294] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9297] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9309] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9316] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9322] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9328] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 kernel: ovs-system: entered promiscuous mode
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9353] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9362] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 systemd-udevd[61446]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 14:44:51 np0005477492 kernel: Timeout policy base is empty
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9370] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9384] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9388] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9400] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9408] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9415] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9419] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9430] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9441] dhcp4 (eth0): canceled DHCP transaction
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9442] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9444] dhcp4 (eth0): state changed no lease
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9449] dhcp4 (eth0): activation: beginning transaction (no timeout)
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9480] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9488] audit: op="device-reapply" interface="eth1" ifindex=3 pid=61441 uid=0 result="fail" reason="Device is not activated"
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9541] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9551] dhcp4 (eth0): state changed new lease, address=38.102.83.120
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9561] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Oct  8 14:44:51 np0005477492 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9617] device (eth1): disconnecting for new activation request.
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9618] audit: op="connection-activate" uuid="f659475b-7c6f-5319-b371-519bc515c6f0" name="ci-private-network" pid=61441 uid=0 result="success"
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9625] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9633] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9797] device (eth1): Activation: starting connection 'ci-private-network' (f659475b-7c6f-5319-b371-519bc515c6f0)
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9808] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9838] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9851] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 kernel: br-ex: entered promiscuous mode
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9866] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9878] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9890] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=61441 uid=0 result="success"
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9892] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9898] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9905] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9907] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9910] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9937] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9954] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9964] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9972] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 14:44:51 np0005477492 NetworkManager[58665]: <info>  [1759949091.9991] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Oct  8 14:44:52 np0005477492 NetworkManager[58665]: <info>  [1759949092.0002] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 14:44:52 np0005477492 NetworkManager[58665]: <info>  [1759949092.0018] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Oct  8 14:44:52 np0005477492 NetworkManager[58665]: <info>  [1759949092.0035] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 14:44:52 np0005477492 NetworkManager[58665]: <info>  [1759949092.0052] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Oct  8 14:44:52 np0005477492 NetworkManager[58665]: <info>  [1759949092.0062] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 14:44:52 np0005477492 NetworkManager[58665]: <info>  [1759949092.0080] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Oct  8 14:44:52 np0005477492 NetworkManager[58665]: <info>  [1759949092.0096] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Oct  8 14:44:52 np0005477492 NetworkManager[58665]: <info>  [1759949092.0106] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 14:44:52 np0005477492 NetworkManager[58665]: <info>  [1759949092.0152] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Oct  8 14:44:52 np0005477492 kernel: vlan22: entered promiscuous mode
Oct  8 14:44:52 np0005477492 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Oct  8 14:44:52 np0005477492 NetworkManager[58665]: <info>  [1759949092.0236] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 14:44:52 np0005477492 NetworkManager[58665]: <info>  [1759949092.0254] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 14:44:52 np0005477492 NetworkManager[58665]: <info>  [1759949092.0263] device (eth1): Activation: successful, device activated.
Oct  8 14:44:52 np0005477492 kernel: vlan20: entered promiscuous mode
Oct  8 14:44:52 np0005477492 systemd-udevd[61445]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 14:44:52 np0005477492 NetworkManager[58665]: <info>  [1759949092.0296] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 14:44:52 np0005477492 NetworkManager[58665]: <info>  [1759949092.0350] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 14:44:52 np0005477492 NetworkManager[58665]: <info>  [1759949092.0352] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 14:44:52 np0005477492 NetworkManager[58665]: <info>  [1759949092.0364] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Oct  8 14:44:52 np0005477492 NetworkManager[58665]: <info>  [1759949092.0384] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Oct  8 14:44:52 np0005477492 NetworkManager[58665]: <info>  [1759949092.0419] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Oct  8 14:44:52 np0005477492 NetworkManager[58665]: <info>  [1759949092.0427] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 14:44:52 np0005477492 kernel: vlan21: entered promiscuous mode
Oct  8 14:44:52 np0005477492 NetworkManager[58665]: <info>  [1759949092.0495] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 14:44:52 np0005477492 NetworkManager[58665]: <info>  [1759949092.0516] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 14:44:52 np0005477492 NetworkManager[58665]: <info>  [1759949092.0526] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 14:44:52 np0005477492 NetworkManager[58665]: <info>  [1759949092.0536] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Oct  8 14:44:52 np0005477492 NetworkManager[58665]: <info>  [1759949092.0546] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 14:44:52 np0005477492 NetworkManager[58665]: <info>  [1759949092.0550] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 14:44:52 np0005477492 NetworkManager[58665]: <info>  [1759949092.0561] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Oct  8 14:44:52 np0005477492 NetworkManager[58665]: <info>  [1759949092.0687] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Oct  8 14:44:52 np0005477492 NetworkManager[58665]: <info>  [1759949092.0705] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 14:44:52 np0005477492 NetworkManager[58665]: <info>  [1759949092.0731] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 14:44:52 np0005477492 NetworkManager[58665]: <info>  [1759949092.0733] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 14:44:52 np0005477492 NetworkManager[58665]: <info>  [1759949092.0740] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Oct  8 14:44:53 np0005477492 NetworkManager[58665]: <info>  [1759949093.2511] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=61441 uid=0 result="success"
Oct  8 14:44:53 np0005477492 NetworkManager[58665]: <info>  [1759949093.4616] checkpoint[0x55a3521c3950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Oct  8 14:44:53 np0005477492 NetworkManager[58665]: <info>  [1759949093.4621] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=61441 uid=0 result="success"
Oct  8 14:44:53 np0005477492 python3.9[61773]: ansible-ansible.legacy.async_status Invoked with jid=j624244218040.61435 mode=status _async_dir=/root/.ansible_async
Oct  8 14:44:53 np0005477492 NetworkManager[58665]: <info>  [1759949093.7130] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=61441 uid=0 result="success"
Oct  8 14:44:53 np0005477492 NetworkManager[58665]: <info>  [1759949093.7144] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=61441 uid=0 result="success"
Oct  8 14:44:53 np0005477492 NetworkManager[58665]: <info>  [1759949093.9256] audit: op="networking-control" arg="global-dns-configuration" pid=61441 uid=0 result="success"
Oct  8 14:44:53 np0005477492 NetworkManager[58665]: <info>  [1759949093.9290] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Oct  8 14:44:53 np0005477492 NetworkManager[58665]: <info>  [1759949093.9332] audit: op="networking-control" arg="global-dns-configuration" pid=61441 uid=0 result="success"
Oct  8 14:44:53 np0005477492 NetworkManager[58665]: <info>  [1759949093.9368] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=61441 uid=0 result="success"
Oct  8 14:44:54 np0005477492 NetworkManager[58665]: <info>  [1759949094.1642] checkpoint[0x55a3521c3a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Oct  8 14:44:54 np0005477492 NetworkManager[58665]: <info>  [1759949094.1647] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=61441 uid=0 result="success"
Oct  8 14:44:54 np0005477492 ansible-async_wrapper.py[61439]: Module complete (61439)
Oct  8 14:44:54 np0005477492 ansible-async_wrapper.py[61438]: Done in kid B.
Oct  8 14:44:57 np0005477492 python3.9[61880]: ansible-ansible.legacy.async_status Invoked with jid=j624244218040.61435 mode=status _async_dir=/root/.ansible_async
Oct  8 14:44:57 np0005477492 python3.9[61979]: ansible-ansible.legacy.async_status Invoked with jid=j624244218040.61435 mode=cleanup _async_dir=/root/.ansible_async
Oct  8 14:44:58 np0005477492 python3.9[62131]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:44:59 np0005477492 python3.9[62254]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759949098.0465724-322-143614679258900/.source.returncode _original_basename=.hh10x_80 follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:44:59 np0005477492 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct  8 14:44:59 np0005477492 python3.9[62408]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:45:00 np0005477492 python3.9[62532]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759949099.4553628-338-73782501492403/.source.cfg _original_basename=.9tvfm9oh follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:45:01 np0005477492 python3.9[62684]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 14:45:01 np0005477492 systemd[1]: Reloading Network Manager...
Oct  8 14:45:01 np0005477492 NetworkManager[58665]: <info>  [1759949101.4490] audit: op="reload" arg="0" pid=62688 uid=0 result="success"
Oct  8 14:45:01 np0005477492 NetworkManager[58665]: <info>  [1759949101.4498] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Oct  8 14:45:01 np0005477492 systemd[1]: Reloaded Network Manager.
Oct  8 14:45:01 np0005477492 systemd[1]: session-11.scope: Deactivated successfully.
Oct  8 14:45:01 np0005477492 systemd[1]: session-11.scope: Consumed 53.687s CPU time.
Oct  8 14:45:01 np0005477492 systemd-logind[786]: Session 11 logged out. Waiting for processes to exit.
Oct  8 14:45:01 np0005477492 systemd-logind[786]: Removed session 11.
Oct  8 14:45:06 np0005477492 systemd-logind[786]: New session 12 of user zuul.
Oct  8 14:45:06 np0005477492 systemd[1]: Started Session 12 of User zuul.
Oct  8 14:45:07 np0005477492 python3.9[62872]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 14:45:09 np0005477492 python3.9[63026]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  8 14:45:10 np0005477492 python3.9[63216]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 14:45:10 np0005477492 systemd-logind[786]: Session 12 logged out. Waiting for processes to exit.
Oct  8 14:45:10 np0005477492 systemd[1]: session-12.scope: Deactivated successfully.
Oct  8 14:45:10 np0005477492 systemd[1]: session-12.scope: Consumed 2.721s CPU time.
Oct  8 14:45:10 np0005477492 systemd-logind[786]: Removed session 12.
Oct  8 14:45:11 np0005477492 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  8 14:45:15 np0005477492 systemd-logind[786]: New session 13 of user zuul.
Oct  8 14:45:15 np0005477492 systemd[1]: Started Session 13 of User zuul.
Oct  8 14:45:17 np0005477492 python3.9[63398]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 14:45:18 np0005477492 python3.9[63552]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 14:45:19 np0005477492 python3.9[63708]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  8 14:45:20 np0005477492 python3.9[63793]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  8 14:45:22 np0005477492 python3.9[63946]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  8 14:45:23 np0005477492 python3.9[64138]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:45:24 np0005477492 python3.9[64290]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 14:45:24 np0005477492 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 14:45:25 np0005477492 python3.9[64453]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:45:26 np0005477492 python3.9[64531]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:45:26 np0005477492 python3.9[64683]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:45:27 np0005477492 python3.9[64761]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 14:45:28 np0005477492 python3.9[64913]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  8 14:45:29 np0005477492 python3.9[65065]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  8 14:45:29 np0005477492 python3.9[65217]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  8 14:45:30 np0005477492 python3.9[65369]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  8 14:45:31 np0005477492 python3.9[65521]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  8 14:45:33 np0005477492 python3.9[65674]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 14:45:34 np0005477492 python3.9[65828]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 14:45:35 np0005477492 python3.9[65980]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 14:45:36 np0005477492 python3.9[66132]: ansible-service_facts Invoked
Oct  8 14:45:36 np0005477492 network[66149]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  8 14:45:36 np0005477492 network[66150]: 'network-scripts' will be removed from distribution in near future.
Oct  8 14:45:36 np0005477492 network[66151]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  8 14:45:42 np0005477492 python3.9[66605]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  8 14:45:44 np0005477492 python3.9[66758]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Oct  8 14:45:45 np0005477492 python3.9[66910]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:45:46 np0005477492 python3.9[67035]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759949145.1931963-220-54850447942380/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:45:47 np0005477492 python3.9[67189]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:45:48 np0005477492 python3.9[67314]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759949146.8067887-235-144175481916916/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:45:49 np0005477492 python3.9[67468]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:45:50 np0005477492 python3.9[67622]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  8 14:45:51 np0005477492 python3.9[67706]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 14:45:52 np0005477492 python3.9[67860]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  8 14:45:53 np0005477492 python3.9[67944]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 14:45:53 np0005477492 chronyd[794]: chronyd exiting
Oct  8 14:45:53 np0005477492 systemd[1]: Stopping NTP client/server...
Oct  8 14:45:53 np0005477492 systemd[1]: chronyd.service: Deactivated successfully.
Oct  8 14:45:53 np0005477492 systemd[1]: Stopped NTP client/server.
Oct  8 14:45:53 np0005477492 systemd[1]: Starting NTP client/server...
Oct  8 14:45:53 np0005477492 chronyd[67954]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Oct  8 14:45:53 np0005477492 chronyd[67954]: Frequency -28.709 +/- 0.114 ppm read from /var/lib/chrony/drift
Oct  8 14:45:53 np0005477492 chronyd[67954]: Loaded seccomp filter (level 2)
Oct  8 14:45:53 np0005477492 systemd[1]: Started NTP client/server.
Oct  8 14:45:54 np0005477492 systemd[1]: session-13.scope: Deactivated successfully.
Oct  8 14:45:54 np0005477492 systemd[1]: session-13.scope: Consumed 28.430s CPU time.
Oct  8 14:45:54 np0005477492 systemd-logind[786]: Session 13 logged out. Waiting for processes to exit.
Oct  8 14:45:54 np0005477492 systemd-logind[786]: Removed session 13.
Oct  8 14:45:59 np0005477492 systemd-logind[786]: New session 14 of user zuul.
Oct  8 14:45:59 np0005477492 systemd[1]: Started Session 14 of User zuul.
Oct  8 14:46:00 np0005477492 python3.9[68134]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 14:46:01 np0005477492 python3.9[68290]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:46:02 np0005477492 python3.9[68465]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:46:03 np0005477492 python3.9[68543]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.z32_3e5z recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:46:04 np0005477492 python3.9[68695]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:46:05 np0005477492 python3.9[68818]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759949163.6902204-61-27965364481561/.source _original_basename=.29s6ubiu follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:46:05 np0005477492 python3.9[68970]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 14:46:06 np0005477492 python3.9[69122]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:46:07 np0005477492 python3.9[69245]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759949166.07117-85-252391981975890/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  8 14:46:08 np0005477492 python3.9[69397]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:46:08 np0005477492 python3.9[69520]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759949167.6283605-85-264493363008697/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  8 14:46:09 np0005477492 python3.9[69672]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:46:10 np0005477492 python3.9[69824]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:46:10 np0005477492 python3.9[69947]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949169.7978468-122-173958538172003/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:46:11 np0005477492 python3.9[70099]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:46:12 np0005477492 python3.9[70222]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949171.1871135-137-277081582945966/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:46:13 np0005477492 python3.9[70374]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 14:46:13 np0005477492 systemd[1]: Reloading.
Oct  8 14:46:13 np0005477492 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 14:46:13 np0005477492 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 14:46:13 np0005477492 systemd[1]: Reloading.
Oct  8 14:46:14 np0005477492 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 14:46:14 np0005477492 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 14:46:14 np0005477492 systemd[1]: Starting EDPM Container Shutdown...
Oct  8 14:46:14 np0005477492 systemd[1]: Finished EDPM Container Shutdown.
Oct  8 14:46:14 np0005477492 python3.9[70602]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:46:15 np0005477492 python3.9[70725]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949174.4304004-160-95052611063579/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:46:16 np0005477492 python3.9[70877]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:46:17 np0005477492 python3.9[71000]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949175.812389-175-150319918730696/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:46:17 np0005477492 python3.9[71152]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 14:46:17 np0005477492 systemd[1]: Reloading.
Oct  8 14:46:18 np0005477492 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 14:46:18 np0005477492 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 14:46:18 np0005477492 systemd[1]: Reloading.
Oct  8 14:46:18 np0005477492 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 14:46:18 np0005477492 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 14:46:18 np0005477492 systemd[1]: Starting Create netns directory...
Oct  8 14:46:18 np0005477492 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  8 14:46:18 np0005477492 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  8 14:46:18 np0005477492 systemd[1]: Finished Create netns directory.
Oct  8 14:46:19 np0005477492 python3.9[71380]: ansible-ansible.builtin.service_facts Invoked
Oct  8 14:46:19 np0005477492 network[71397]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  8 14:46:19 np0005477492 network[71398]: 'network-scripts' will be removed from distribution in near future.
Oct  8 14:46:19 np0005477492 network[71399]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  8 14:46:25 np0005477492 python3.9[71663]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 14:46:25 np0005477492 systemd[1]: Reloading.
Oct  8 14:46:25 np0005477492 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 14:46:25 np0005477492 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 14:46:25 np0005477492 systemd[1]: Stopping IPv4 firewall with iptables...
Oct  8 14:46:26 np0005477492 iptables.init[71704]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Oct  8 14:46:26 np0005477492 iptables.init[71704]: iptables: Flushing firewall rules: [  OK  ]
Oct  8 14:46:26 np0005477492 systemd[1]: iptables.service: Deactivated successfully.
Oct  8 14:46:26 np0005477492 systemd[1]: Stopped IPv4 firewall with iptables.
Oct  8 14:46:27 np0005477492 python3.9[71900]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 14:46:28 np0005477492 python3.9[72054]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 14:46:28 np0005477492 systemd[1]: Reloading.
Oct  8 14:46:28 np0005477492 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 14:46:28 np0005477492 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 14:46:28 np0005477492 systemd[1]: Starting Netfilter Tables...
Oct  8 14:46:28 np0005477492 systemd[1]: Finished Netfilter Tables.
Oct  8 14:46:29 np0005477492 python3.9[72246]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 14:46:30 np0005477492 python3.9[72399]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:46:31 np0005477492 python3.9[72524]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759949189.8840718-244-202160130419150/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=4729b6ffc5b555fa142bf0b6e6dc15609cb89a22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:46:31 np0005477492 python3.9[72675]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 14:46:57 np0005477492 systemd[1]: session-14.scope: Deactivated successfully.
Oct  8 14:46:57 np0005477492 systemd[1]: session-14.scope: Consumed 22.771s CPU time.
Oct  8 14:46:57 np0005477492 systemd-logind[786]: Session 14 logged out. Waiting for processes to exit.
Oct  8 14:46:57 np0005477492 systemd-logind[786]: Removed session 14.
Oct  8 14:47:09 np0005477492 systemd-logind[786]: New session 15 of user zuul.
Oct  8 14:47:09 np0005477492 systemd[1]: Started Session 15 of User zuul.
Oct  8 14:47:10 np0005477492 python3.9[72869]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 14:47:12 np0005477492 python3.9[73025]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:47:13 np0005477492 python3.9[73200]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:47:13 np0005477492 python3.9[73278]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.vs7jp0k0 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:47:14 np0005477492 python3.9[73430]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:47:15 np0005477492 python3.9[73508]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.x9kvqcst recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:47:16 np0005477492 python3.9[73660]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 14:47:16 np0005477492 python3.9[73812]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:47:17 np0005477492 python3.9[73890]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 14:47:18 np0005477492 python3.9[74042]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:47:18 np0005477492 python3.9[74120]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 14:47:19 np0005477492 python3.9[74272]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:47:20 np0005477492 python3.9[74424]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:47:20 np0005477492 python3.9[74502]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:47:21 np0005477492 python3.9[74654]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:47:21 np0005477492 python3.9[74732]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:47:23 np0005477492 python3.9[74884]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 14:47:23 np0005477492 systemd[1]: Reloading.
Oct  8 14:47:23 np0005477492 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 14:47:23 np0005477492 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 14:47:24 np0005477492 python3.9[75073]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:47:24 np0005477492 python3.9[75151]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:47:25 np0005477492 python3.9[75303]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:47:26 np0005477492 python3.9[75381]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:47:27 np0005477492 python3.9[75533]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 14:47:27 np0005477492 systemd[1]: Reloading.
Oct  8 14:47:27 np0005477492 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 14:47:27 np0005477492 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 14:47:27 np0005477492 systemd[1]: Starting Create netns directory...
Oct  8 14:47:27 np0005477492 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  8 14:47:27 np0005477492 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  8 14:47:27 np0005477492 systemd[1]: Finished Create netns directory.
Oct  8 14:47:28 np0005477492 python3.9[75726]: ansible-ansible.builtin.service_facts Invoked
Oct  8 14:47:28 np0005477492 network[75743]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  8 14:47:28 np0005477492 network[75744]: 'network-scripts' will be removed from distribution in near future.
Oct  8 14:47:28 np0005477492 network[75745]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  8 14:47:32 np0005477492 python3.9[76008]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:47:33 np0005477492 python3.9[76086]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:47:33 np0005477492 python3.9[76238]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:47:34 np0005477492 python3.9[76390]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:47:35 np0005477492 python3.9[76513]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949254.2026577-216-156376028984193/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:47:36 np0005477492 python3.9[76665]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Oct  8 14:47:36 np0005477492 systemd[1]: Starting Time & Date Service...
Oct  8 14:47:36 np0005477492 systemd[1]: Started Time & Date Service.
Oct  8 14:47:37 np0005477492 python3.9[76821]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:47:38 np0005477492 python3.9[76973]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:47:38 np0005477492 python3.9[77096]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759949257.6909366-251-140599022604894/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:47:39 np0005477492 python3.9[77248]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:47:40 np0005477492 python3.9[77371]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759949259.0690176-266-199490047114041/.source.yaml _original_basename=.0ti04kl_ follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:47:41 np0005477492 python3.9[77523]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:47:41 np0005477492 python3.9[77646]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949260.5276635-281-224859374963213/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:47:42 np0005477492 python3.9[77798]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 14:47:43 np0005477492 python3.9[77951]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 14:47:44 np0005477492 python3[78104]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct  8 14:47:45 np0005477492 python3.9[78256]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:47:45 np0005477492 python3.9[78379]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949264.5114021-320-3757468103690/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:47:46 np0005477492 python3.9[78531]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:47:47 np0005477492 python3.9[78654]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949265.9579628-335-147311120881576/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:47:47 np0005477492 python3.9[78806]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:47:48 np0005477492 python3.9[78929]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949267.449393-350-64196000175449/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:47:49 np0005477492 python3.9[79081]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:47:50 np0005477492 python3.9[79204]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949268.828244-365-244731069020195/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:47:50 np0005477492 python3.9[79356]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 14:47:51 np0005477492 python3.9[79479]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949270.239594-380-28873690016293/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:47:52 np0005477492 python3.9[79631]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:47:53 np0005477492 python3.9[79783]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 14:47:54 np0005477492 python3.9[79942]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:47:54 np0005477492 python3.9[80095]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:47:55 np0005477492 python3.9[80247]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:47:56 np0005477492 python3.9[80399]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Oct  8 14:47:56 np0005477492 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  8 14:47:56 np0005477492 rsyslogd[1004]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  8 14:47:57 np0005477492 python3.9[80553]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Oct  8 14:47:57 np0005477492 systemd[1]: session-15.scope: Deactivated successfully.
Oct  8 14:47:57 np0005477492 systemd[1]: session-15.scope: Consumed 37.359s CPU time.
Oct  8 14:47:57 np0005477492 systemd-logind[786]: Session 15 logged out. Waiting for processes to exit.
Oct  8 14:47:57 np0005477492 systemd-logind[786]: Removed session 15.
Oct  8 14:48:02 np0005477492 systemd-logind[786]: New session 16 of user zuul.
Oct  8 14:48:02 np0005477492 systemd[1]: Started Session 16 of User zuul.
Oct  8 14:48:03 np0005477492 python3.9[80736]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Oct  8 14:48:03 np0005477492 chronyd[67954]: Selected source 45.61.49.156 (pool.ntp.org)
Oct  8 14:48:04 np0005477492 python3.9[80888]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 14:48:05 np0005477492 python3.9[81040]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 14:48:06 np0005477492 python3.9[81192]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDVe0/FNjTVCj9UBdVEox0TsBm6l6BSx1bFgTZ7SPyRKER7nryaTY1scYOlMQuARRGVFBMQl1t3JbSCPI1W5dO3TmJiTUmZlgpnm3jWPCxYe5fh+AW7s5RiQlwiIVdrYYCDd6c+ycIeyLKFC68oI7a6dNGy9lG9IZnoT6oYXjT0AHHTjRejF83VhrZRHxTRVscvBsfTTaC/MLYJHWzvc3PQBKjEs23sdNj07oqRetAYJs933kiILU6nwpxfq/b4l/n5nETJiCT7c0W+iqI1zGEnbUbct61Nj4fGHyU33nj+oM1AWeFLux7Q0HiVTQTPIKRSzkndu/ZE3exNqcUcNPrl8vrLDVuB0F7aNlCLOJoA6VLKOaVP/JeT5hoFgZ/Erkvxz/WJqIqOCzeatDdwNT+iAaqnFKgnLyTNrdQqV6wphVKlKnkC/hhpc9mJ7/8mySXs3himoocAguUYsbN+F+FsRgrUlnvhTLMeSvYnPQaj+6XsOLmHNsJpJIj+pgLdgZ0=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIG343WCQyWX42xVSEQmRdbLMiSgH5Ycblsc8HSC8Au6/#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIxYWoMLXSaMBKBKjHv7e2sClnJ5Dqw1nz8nfwzB2XSUuk0oyUSmLoCzWyTmdTLiZBHqZ5z07Umm+y7TTU3AcKs=#012 create=True mode=0644 path=/tmp/ansible.589fiu56 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:48:06 np0005477492 systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct  8 14:48:07 np0005477492 python3.9[81346]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.589fiu56' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 14:48:08 np0005477492 python3.9[81500]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.589fiu56 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:48:08 np0005477492 systemd[1]: session-16.scope: Deactivated successfully.
Oct  8 14:48:08 np0005477492 systemd[1]: session-16.scope: Consumed 3.837s CPU time.
Oct  8 14:48:08 np0005477492 systemd-logind[786]: Session 16 logged out. Waiting for processes to exit.
Oct  8 14:48:08 np0005477492 systemd-logind[786]: Removed session 16.
Oct  8 14:48:13 np0005477492 systemd-logind[786]: New session 17 of user zuul.
Oct  8 14:48:13 np0005477492 systemd[1]: Started Session 17 of User zuul.
Oct  8 14:48:15 np0005477492 python3.9[81678]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 14:48:16 np0005477492 python3.9[81834]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Oct  8 14:48:17 np0005477492 python3.9[81988]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 14:48:18 np0005477492 python3.9[82141]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 14:48:19 np0005477492 python3.9[82294]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 14:48:20 np0005477492 python3.9[82448]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 14:48:21 np0005477492 python3.9[82603]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:48:21 np0005477492 systemd[1]: session-17.scope: Deactivated successfully.
Oct  8 14:48:21 np0005477492 systemd[1]: session-17.scope: Consumed 5.427s CPU time.
Oct  8 14:48:21 np0005477492 systemd-logind[786]: Session 17 logged out. Waiting for processes to exit.
Oct  8 14:48:21 np0005477492 systemd-logind[786]: Removed session 17.
Oct  8 14:48:27 np0005477492 systemd-logind[786]: New session 18 of user zuul.
Oct  8 14:48:27 np0005477492 systemd[1]: Started Session 18 of User zuul.
Oct  8 14:48:28 np0005477492 python3.9[82781]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 14:48:29 np0005477492 python3.9[82937]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  8 14:48:30 np0005477492 python3.9[83021]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  8 14:48:32 np0005477492 python3.9[83172]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 14:48:33 np0005477492 python3.9[83325]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/reboot_required/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:48:34 np0005477492 python3.9[83477]: ansible-ansible.builtin.file Invoked with mode=0600 path=/var/lib/openstack/reboot_required/needs_restarting state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:48:35 np0005477492 python3.9[83629]: ansible-ansible.builtin.lineinfile Invoked with dest=/var/lib/openstack/reboot_required/needs_restarting line=Core libraries or services have been updated since boot-up:#012  * systemd#012#012Reboot is required to fully utilize these updates.#012More information: https://access.redhat.com/solutions/27943 path=/var/lib/openstack/reboot_required/needs_restarting state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 14:48:36 np0005477492 python3.9[83779]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  8 14:48:37 np0005477492 python3.9[83929]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 14:48:37 np0005477492 python3.9[84079]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 14:48:38 np0005477492 python3.9[84231]: ansible-ansible.legacy.setup Invoked with gather_subset=['min'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 14:48:39 np0005477492 python3.9[84344]: ansible-ansible.legacy.find Invoked with paths=['/sbin', '/bin', '/usr/sbin', '/usr/bin', '/usr/local/sbin'] patterns=['shutdown'] file_type=any read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  8 18:48:51 compute-0 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Oct  8 18:48:51 compute-0 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64 root=UUID=1631a6ad-43b8-436d-ae76-16fa14b94458 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct  8 18:48:51 compute-0 kernel: BIOS-provided physical RAM map:
Oct  8 18:48:51 compute-0 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Oct  8 18:48:51 compute-0 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Oct  8 18:48:51 compute-0 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Oct  8 18:48:51 compute-0 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Oct  8 18:48:51 compute-0 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Oct  8 18:48:51 compute-0 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Oct  8 18:48:51 compute-0 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Oct  8 18:48:51 compute-0 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Oct  8 18:48:51 compute-0 kernel: NX (Execute Disable) protection: active
Oct  8 18:48:51 compute-0 kernel: APIC: Static calls initialized
Oct  8 18:48:51 compute-0 kernel: SMBIOS 2.8 present.
Oct  8 18:48:51 compute-0 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Oct  8 18:48:51 compute-0 kernel: Hypervisor detected: KVM
Oct  8 18:48:51 compute-0 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Oct  8 18:48:51 compute-0 kernel: kvm-clock: using sched offset of 7394704554048 cycles
Oct  8 18:48:51 compute-0 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Oct  8 18:48:51 compute-0 kernel: tsc: Detected 2800.000 MHz processor
Oct  8 18:48:51 compute-0 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Oct  8 18:48:51 compute-0 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Oct  8 18:48:51 compute-0 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Oct  8 18:48:51 compute-0 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Oct  8 18:48:51 compute-0 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Oct  8 18:48:51 compute-0 kernel: Using GB pages for direct mapping
Oct  8 18:48:51 compute-0 kernel: RAMDISK: [mem 0x2d7c4000-0x32bd9fff]
Oct  8 18:48:51 compute-0 kernel: ACPI: Early table checksum verification disabled
Oct  8 18:48:51 compute-0 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Oct  8 18:48:51 compute-0 kernel: ACPI: RSDT 0x00000000BFFE16C4 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  8 18:48:51 compute-0 kernel: ACPI: FACP 0x00000000BFFE1578 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  8 18:48:51 compute-0 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F8 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  8 18:48:51 compute-0 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Oct  8 18:48:51 compute-0 kernel: ACPI: APIC 0x00000000BFFE15EC 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  8 18:48:51 compute-0 kernel: ACPI: WAET 0x00000000BFFE169C 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  8 18:48:51 compute-0 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1578-0xbffe15eb]
Oct  8 18:48:51 compute-0 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1577]
Oct  8 18:48:51 compute-0 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Oct  8 18:48:51 compute-0 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15ec-0xbffe169b]
Oct  8 18:48:51 compute-0 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe169c-0xbffe16c3]
Oct  8 18:48:51 compute-0 kernel: No NUMA configuration found
Oct  8 18:48:51 compute-0 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Oct  8 18:48:51 compute-0 kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Oct  8 18:48:51 compute-0 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Oct  8 18:48:51 compute-0 kernel: Zone ranges:
Oct  8 18:48:51 compute-0 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Oct  8 18:48:51 compute-0 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Oct  8 18:48:51 compute-0 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Oct  8 18:48:51 compute-0 kernel:  Device   empty
Oct  8 18:48:51 compute-0 kernel: Movable zone start for each node
Oct  8 18:48:51 compute-0 kernel: Early memory node ranges
Oct  8 18:48:51 compute-0 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Oct  8 18:48:51 compute-0 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Oct  8 18:48:51 compute-0 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Oct  8 18:48:51 compute-0 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Oct  8 18:48:51 compute-0 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Oct  8 18:48:51 compute-0 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Oct  8 18:48:51 compute-0 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Oct  8 18:48:51 compute-0 kernel: ACPI: PM-Timer IO Port: 0x608
Oct  8 18:48:51 compute-0 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Oct  8 18:48:51 compute-0 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Oct  8 18:48:51 compute-0 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Oct  8 18:48:51 compute-0 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Oct  8 18:48:51 compute-0 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Oct  8 18:48:51 compute-0 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Oct  8 18:48:51 compute-0 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Oct  8 18:48:51 compute-0 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Oct  8 18:48:51 compute-0 kernel: TSC deadline timer available
Oct  8 18:48:51 compute-0 kernel: CPU topo: Max. logical packages:   8
Oct  8 18:48:51 compute-0 kernel: CPU topo: Max. logical dies:       8
Oct  8 18:48:51 compute-0 kernel: CPU topo: Max. dies per package:   1
Oct  8 18:48:51 compute-0 kernel: CPU topo: Max. threads per core:   1
Oct  8 18:48:51 compute-0 kernel: CPU topo: Num. cores per package:     1
Oct  8 18:48:51 compute-0 kernel: CPU topo: Num. threads per package:   1
Oct  8 18:48:51 compute-0 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Oct  8 18:48:51 compute-0 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Oct  8 18:48:51 compute-0 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Oct  8 18:48:51 compute-0 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Oct  8 18:48:51 compute-0 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Oct  8 18:48:51 compute-0 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Oct  8 18:48:51 compute-0 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Oct  8 18:48:51 compute-0 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Oct  8 18:48:51 compute-0 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Oct  8 18:48:51 compute-0 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Oct  8 18:48:51 compute-0 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Oct  8 18:48:51 compute-0 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Oct  8 18:48:51 compute-0 kernel: Booting paravirtualized kernel on KVM
Oct  8 18:48:51 compute-0 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Oct  8 18:48:51 compute-0 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Oct  8 18:48:51 compute-0 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Oct  8 18:48:51 compute-0 kernel: kvm-guest: PV spinlocks disabled, no host support
Oct  8 18:48:51 compute-0 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64 root=UUID=1631a6ad-43b8-436d-ae76-16fa14b94458 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct  8 18:48:51 compute-0 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64", will be passed to user space.
Oct  8 18:48:51 compute-0 kernel: random: crng init done
Oct  8 18:48:51 compute-0 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Oct  8 18:48:51 compute-0 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Oct  8 18:48:51 compute-0 kernel: Fallback order for Node 0: 0 
Oct  8 18:48:51 compute-0 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Oct  8 18:48:51 compute-0 kernel: Policy zone: Normal
Oct  8 18:48:51 compute-0 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Oct  8 18:48:51 compute-0 kernel: software IO TLB: area num 8.
Oct  8 18:48:51 compute-0 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Oct  8 18:48:51 compute-0 kernel: ftrace: allocating 49370 entries in 193 pages
Oct  8 18:48:51 compute-0 kernel: ftrace: allocated 193 pages with 3 groups
Oct  8 18:48:51 compute-0 kernel: Dynamic Preempt: voluntary
Oct  8 18:48:51 compute-0 kernel: rcu: Preemptible hierarchical RCU implementation.
Oct  8 18:48:51 compute-0 kernel: rcu: #011RCU event tracing is enabled.
Oct  8 18:48:51 compute-0 kernel: rcu: #011RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Oct  8 18:48:51 compute-0 kernel: #011Trampoline variant of Tasks RCU enabled.
Oct  8 18:48:51 compute-0 kernel: #011Rude variant of Tasks RCU enabled.
Oct  8 18:48:51 compute-0 kernel: #011Tracing variant of Tasks RCU enabled.
Oct  8 18:48:51 compute-0 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Oct  8 18:48:51 compute-0 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Oct  8 18:48:51 compute-0 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct  8 18:48:51 compute-0 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct  8 18:48:51 compute-0 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct  8 18:48:51 compute-0 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Oct  8 18:48:51 compute-0 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Oct  8 18:48:51 compute-0 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Oct  8 18:48:51 compute-0 kernel: Console: colour VGA+ 80x25
Oct  8 18:48:51 compute-0 kernel: printk: console [ttyS0] enabled
Oct  8 18:48:51 compute-0 kernel: ACPI: Core revision 20230331
Oct  8 18:48:51 compute-0 kernel: APIC: Switch to symmetric I/O mode setup
Oct  8 18:48:51 compute-0 kernel: x2apic enabled
Oct  8 18:48:51 compute-0 kernel: APIC: Switched APIC routing to: physical x2apic
Oct  8 18:48:51 compute-0 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Oct  8 18:48:51 compute-0 kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Oct  8 18:48:51 compute-0 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Oct  8 18:48:51 compute-0 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Oct  8 18:48:51 compute-0 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Oct  8 18:48:51 compute-0 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Oct  8 18:48:51 compute-0 kernel: Spectre V2 : Mitigation: Retpolines
Oct  8 18:48:51 compute-0 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Oct  8 18:48:51 compute-0 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Oct  8 18:48:51 compute-0 kernel: RETBleed: Mitigation: untrained return thunk
Oct  8 18:48:51 compute-0 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Oct  8 18:48:51 compute-0 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Oct  8 18:48:51 compute-0 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Oct  8 18:48:51 compute-0 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Oct  8 18:48:51 compute-0 kernel: x86/bugs: return thunk changed
Oct  8 18:48:51 compute-0 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Oct  8 18:48:51 compute-0 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Oct  8 18:48:51 compute-0 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Oct  8 18:48:51 compute-0 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Oct  8 18:48:51 compute-0 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Oct  8 18:48:51 compute-0 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Oct  8 18:48:51 compute-0 kernel: Freeing SMP alternatives memory: 40K
Oct  8 18:48:51 compute-0 kernel: pid_max: default: 32768 minimum: 301
Oct  8 18:48:51 compute-0 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Oct  8 18:48:51 compute-0 kernel: landlock: Up and running.
Oct  8 18:48:51 compute-0 kernel: Yama: becoming mindful.
Oct  8 18:48:51 compute-0 kernel: SELinux:  Initializing.
Oct  8 18:48:51 compute-0 kernel: LSM support for eBPF active
Oct  8 18:48:51 compute-0 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Oct  8 18:48:51 compute-0 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Oct  8 18:48:51 compute-0 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Oct  8 18:48:51 compute-0 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Oct  8 18:48:51 compute-0 kernel: ... version:                0
Oct  8 18:48:51 compute-0 kernel: ... bit width:              48
Oct  8 18:48:51 compute-0 kernel: ... generic registers:      6
Oct  8 18:48:51 compute-0 kernel: ... value mask:             0000ffffffffffff
Oct  8 18:48:51 compute-0 kernel: ... max period:             00007fffffffffff
Oct  8 18:48:51 compute-0 kernel: ... fixed-purpose events:   0
Oct  8 18:48:51 compute-0 kernel: ... event mask:             000000000000003f
Oct  8 18:48:51 compute-0 kernel: signal: max sigframe size: 1776
Oct  8 18:48:51 compute-0 kernel: rcu: Hierarchical SRCU implementation.
Oct  8 18:48:51 compute-0 kernel: rcu: #011Max phase no-delay instances is 400.
Oct  8 18:48:51 compute-0 kernel: smp: Bringing up secondary CPUs ...
Oct  8 18:48:51 compute-0 kernel: smpboot: x86: Booting SMP configuration:
Oct  8 18:48:51 compute-0 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Oct  8 18:48:51 compute-0 kernel: smp: Brought up 1 node, 8 CPUs
Oct  8 18:48:51 compute-0 kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Oct  8 18:48:51 compute-0 kernel: node 0 deferred pages initialised in 32ms
Oct  8 18:48:51 compute-0 kernel: Memory: 7765660K/8388068K available (16384K kernel code, 5784K rwdata, 13996K rodata, 4068K init, 7304K bss, 616504K reserved, 0K cma-reserved)
Oct  8 18:48:51 compute-0 kernel: devtmpfs: initialized
Oct  8 18:48:51 compute-0 kernel: x86/mm: Memory block size: 128MB
Oct  8 18:48:51 compute-0 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Oct  8 18:48:51 compute-0 kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Oct  8 18:48:51 compute-0 kernel: pinctrl core: initialized pinctrl subsystem
Oct  8 18:48:51 compute-0 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Oct  8 18:48:51 compute-0 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Oct  8 18:48:51 compute-0 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Oct  8 18:48:51 compute-0 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Oct  8 18:48:51 compute-0 kernel: audit: initializing netlink subsys (disabled)
Oct  8 18:48:51 compute-0 kernel: audit: type=2000 audit(1759949329.395:1): state=initialized audit_enabled=0 res=1
Oct  8 18:48:51 compute-0 kernel: thermal_sys: Registered thermal governor 'fair_share'
Oct  8 18:48:51 compute-0 kernel: thermal_sys: Registered thermal governor 'step_wise'
Oct  8 18:48:51 compute-0 kernel: thermal_sys: Registered thermal governor 'user_space'
Oct  8 18:48:51 compute-0 kernel: cpuidle: using governor menu
Oct  8 18:48:51 compute-0 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Oct  8 18:48:51 compute-0 kernel: PCI: Using configuration type 1 for base access
Oct  8 18:48:51 compute-0 kernel: PCI: Using configuration type 1 for extended access
Oct  8 18:48:51 compute-0 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Oct  8 18:48:51 compute-0 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Oct  8 18:48:51 compute-0 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Oct  8 18:48:51 compute-0 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Oct  8 18:48:51 compute-0 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Oct  8 18:48:51 compute-0 kernel: Demotion targets for Node 0: null
Oct  8 18:48:51 compute-0 kernel: cryptd: max_cpu_qlen set to 1000
Oct  8 18:48:51 compute-0 kernel: ACPI: Added _OSI(Module Device)
Oct  8 18:48:51 compute-0 kernel: ACPI: Added _OSI(Processor Device)
Oct  8 18:48:51 compute-0 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Oct  8 18:48:51 compute-0 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Oct  8 18:48:51 compute-0 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Oct  8 18:48:51 compute-0 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Oct  8 18:48:51 compute-0 kernel: ACPI: Interpreter enabled
Oct  8 18:48:51 compute-0 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Oct  8 18:48:51 compute-0 kernel: ACPI: Using IOAPIC for interrupt routing
Oct  8 18:48:51 compute-0 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Oct  8 18:48:51 compute-0 kernel: PCI: Using E820 reservations for host bridge windows
Oct  8 18:48:51 compute-0 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Oct  8 18:48:51 compute-0 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Oct  8 18:48:51 compute-0 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Oct  8 18:48:51 compute-0 kernel: acpiphp: Slot [3] registered
Oct  8 18:48:51 compute-0 kernel: acpiphp: Slot [4] registered
Oct  8 18:48:51 compute-0 kernel: acpiphp: Slot [5] registered
Oct  8 18:48:51 compute-0 kernel: acpiphp: Slot [6] registered
Oct  8 18:48:51 compute-0 kernel: acpiphp: Slot [7] registered
Oct  8 18:48:51 compute-0 kernel: acpiphp: Slot [8] registered
Oct  8 18:48:51 compute-0 kernel: acpiphp: Slot [9] registered
Oct  8 18:48:51 compute-0 kernel: acpiphp: Slot [10] registered
Oct  8 18:48:51 compute-0 kernel: acpiphp: Slot [11] registered
Oct  8 18:48:51 compute-0 kernel: acpiphp: Slot [12] registered
Oct  8 18:48:51 compute-0 kernel: acpiphp: Slot [13] registered
Oct  8 18:48:51 compute-0 kernel: acpiphp: Slot [14] registered
Oct  8 18:48:51 compute-0 kernel: acpiphp: Slot [15] registered
Oct  8 18:48:51 compute-0 kernel: acpiphp: Slot [16] registered
Oct  8 18:48:51 compute-0 kernel: acpiphp: Slot [17] registered
Oct  8 18:48:51 compute-0 kernel: acpiphp: Slot [18] registered
Oct  8 18:48:51 compute-0 kernel: acpiphp: Slot [19] registered
Oct  8 18:48:51 compute-0 kernel: acpiphp: Slot [20] registered
Oct  8 18:48:51 compute-0 kernel: acpiphp: Slot [21] registered
Oct  8 18:48:51 compute-0 kernel: acpiphp: Slot [22] registered
Oct  8 18:48:51 compute-0 kernel: acpiphp: Slot [23] registered
Oct  8 18:48:51 compute-0 kernel: acpiphp: Slot [24] registered
Oct  8 18:48:51 compute-0 kernel: acpiphp: Slot [25] registered
Oct  8 18:48:51 compute-0 kernel: acpiphp: Slot [26] registered
Oct  8 18:48:51 compute-0 kernel: acpiphp: Slot [27] registered
Oct  8 18:48:51 compute-0 kernel: acpiphp: Slot [28] registered
Oct  8 18:48:51 compute-0 kernel: acpiphp: Slot [29] registered
Oct  8 18:48:51 compute-0 kernel: acpiphp: Slot [30] registered
Oct  8 18:48:51 compute-0 kernel: acpiphp: Slot [31] registered
Oct  8 18:48:51 compute-0 kernel: PCI host bridge to bus 0000:00
Oct  8 18:48:51 compute-0 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Oct  8 18:48:51 compute-0 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Oct  8 18:48:51 compute-0 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Oct  8 18:48:51 compute-0 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Oct  8 18:48:51 compute-0 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Oct  8 18:48:51 compute-0 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:01.1: BAR 4 [io  0xc180-0xc18f]
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:01.2: BAR 4 [io  0xc140-0xc15f]
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:03.0: ROM [mem 0xfea80000-0xfeafffff pref]
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:06.0: BAR 0 [io  0xc160-0xc17f]
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:07.0: BAR 0 [io  0xc100-0xc13f]
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:07.0: BAR 1 [mem 0xfeb93000-0xfeb93fff]
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:07.0: BAR 4 [mem 0xfe814000-0xfe817fff 64bit pref]
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:07.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Oct  8 18:48:51 compute-0 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Oct  8 18:48:51 compute-0 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Oct  8 18:48:51 compute-0 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Oct  8 18:48:51 compute-0 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Oct  8 18:48:51 compute-0 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Oct  8 18:48:51 compute-0 kernel: iommu: Default domain type: Translated
Oct  8 18:48:51 compute-0 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Oct  8 18:48:51 compute-0 kernel: SCSI subsystem initialized
Oct  8 18:48:51 compute-0 kernel: ACPI: bus type USB registered
Oct  8 18:48:51 compute-0 kernel: usbcore: registered new interface driver usbfs
Oct  8 18:48:51 compute-0 kernel: usbcore: registered new interface driver hub
Oct  8 18:48:51 compute-0 kernel: usbcore: registered new device driver usb
Oct  8 18:48:51 compute-0 kernel: pps_core: LinuxPPS API ver. 1 registered
Oct  8 18:48:51 compute-0 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Oct  8 18:48:51 compute-0 kernel: PTP clock support registered
Oct  8 18:48:51 compute-0 kernel: EDAC MC: Ver: 3.0.0
Oct  8 18:48:51 compute-0 kernel: NetLabel: Initializing
Oct  8 18:48:51 compute-0 kernel: NetLabel:  domain hash size = 128
Oct  8 18:48:51 compute-0 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Oct  8 18:48:51 compute-0 kernel: NetLabel:  unlabeled traffic allowed by default
Oct  8 18:48:51 compute-0 kernel: PCI: Using ACPI for IRQ routing
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Oct  8 18:48:51 compute-0 kernel: vgaarb: loaded
Oct  8 18:48:51 compute-0 kernel: clocksource: Switched to clocksource kvm-clock
Oct  8 18:48:51 compute-0 kernel: VFS: Disk quotas dquot_6.6.0
Oct  8 18:48:51 compute-0 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Oct  8 18:48:51 compute-0 kernel: pnp: PnP ACPI init
Oct  8 18:48:51 compute-0 kernel: pnp: PnP ACPI: found 5 devices
Oct  8 18:48:51 compute-0 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Oct  8 18:48:51 compute-0 kernel: NET: Registered PF_INET protocol family
Oct  8 18:48:51 compute-0 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Oct  8 18:48:51 compute-0 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Oct  8 18:48:51 compute-0 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Oct  8 18:48:51 compute-0 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Oct  8 18:48:51 compute-0 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Oct  8 18:48:51 compute-0 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Oct  8 18:48:51 compute-0 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Oct  8 18:48:51 compute-0 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Oct  8 18:48:51 compute-0 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Oct  8 18:48:51 compute-0 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Oct  8 18:48:51 compute-0 kernel: NET: Registered PF_XDP protocol family
Oct  8 18:48:51 compute-0 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Oct  8 18:48:51 compute-0 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Oct  8 18:48:51 compute-0 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Oct  8 18:48:51 compute-0 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Oct  8 18:48:51 compute-0 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Oct  8 18:48:51 compute-0 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Oct  8 18:48:51 compute-0 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x140 took 72283 usecs
Oct  8 18:48:51 compute-0 kernel: PCI: CLS 0 bytes, default 64
Oct  8 18:48:51 compute-0 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Oct  8 18:48:51 compute-0 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Oct  8 18:48:51 compute-0 kernel: Trying to unpack rootfs image as initramfs...
Oct  8 18:48:51 compute-0 kernel: ACPI: bus type thunderbolt registered
Oct  8 18:48:51 compute-0 kernel: Initialise system trusted keyrings
Oct  8 18:48:51 compute-0 kernel: Key type blacklist registered
Oct  8 18:48:51 compute-0 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Oct  8 18:48:51 compute-0 kernel: zbud: loaded
Oct  8 18:48:51 compute-0 kernel: integrity: Platform Keyring initialized
Oct  8 18:48:51 compute-0 kernel: integrity: Machine keyring initialized
Oct  8 18:48:51 compute-0 kernel: Freeing initrd memory: 86104K
Oct  8 18:48:51 compute-0 kernel: NET: Registered PF_ALG protocol family
Oct  8 18:48:51 compute-0 kernel: xor: automatically using best checksumming function   avx       
Oct  8 18:48:51 compute-0 kernel: Key type asymmetric registered
Oct  8 18:48:51 compute-0 kernel: Asymmetric key parser 'x509' registered
Oct  8 18:48:51 compute-0 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Oct  8 18:48:51 compute-0 kernel: io scheduler mq-deadline registered
Oct  8 18:48:51 compute-0 kernel: io scheduler kyber registered
Oct  8 18:48:51 compute-0 kernel: io scheduler bfq registered
Oct  8 18:48:51 compute-0 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Oct  8 18:48:51 compute-0 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Oct  8 18:48:51 compute-0 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Oct  8 18:48:51 compute-0 kernel: ACPI: button: Power Button [PWRF]
Oct  8 18:48:51 compute-0 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Oct  8 18:48:51 compute-0 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Oct  8 18:48:51 compute-0 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Oct  8 18:48:51 compute-0 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Oct  8 18:48:51 compute-0 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Oct  8 18:48:51 compute-0 kernel: Non-volatile memory driver v1.3
Oct  8 18:48:51 compute-0 kernel: rdac: device handler registered
Oct  8 18:48:51 compute-0 kernel: hp_sw: device handler registered
Oct  8 18:48:51 compute-0 kernel: emc: device handler registered
Oct  8 18:48:51 compute-0 kernel: alua: device handler registered
Oct  8 18:48:51 compute-0 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Oct  8 18:48:51 compute-0 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Oct  8 18:48:51 compute-0 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Oct  8 18:48:51 compute-0 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c140
Oct  8 18:48:51 compute-0 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Oct  8 18:48:51 compute-0 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Oct  8 18:48:51 compute-0 kernel: usb usb1: Product: UHCI Host Controller
Oct  8 18:48:51 compute-0 kernel: usb usb1: Manufacturer: Linux 5.14.0-620.el9.x86_64 uhci_hcd
Oct  8 18:48:51 compute-0 kernel: usb usb1: SerialNumber: 0000:00:01.2
Oct  8 18:48:51 compute-0 kernel: hub 1-0:1.0: USB hub found
Oct  8 18:48:51 compute-0 kernel: hub 1-0:1.0: 2 ports detected
Oct  8 18:48:51 compute-0 kernel: usbcore: registered new interface driver usbserial_generic
Oct  8 18:48:51 compute-0 kernel: usbserial: USB Serial support registered for generic
Oct  8 18:48:51 compute-0 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Oct  8 18:48:51 compute-0 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Oct  8 18:48:51 compute-0 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Oct  8 18:48:51 compute-0 kernel: mousedev: PS/2 mouse device common for all mice
Oct  8 18:48:51 compute-0 kernel: rtc_cmos 00:04: RTC can wake from S4
Oct  8 18:48:51 compute-0 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Oct  8 18:48:51 compute-0 kernel: rtc_cmos 00:04: registered as rtc0
Oct  8 18:48:51 compute-0 kernel: rtc_cmos 00:04: setting system clock to 2025-10-08T18:48:50 UTC (1759949330)
Oct  8 18:48:51 compute-0 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Oct  8 18:48:51 compute-0 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Oct  8 18:48:51 compute-0 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Oct  8 18:48:51 compute-0 kernel: hid: raw HID events driver (C) Jiri Kosina
Oct  8 18:48:51 compute-0 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Oct  8 18:48:51 compute-0 kernel: usbcore: registered new interface driver usbhid
Oct  8 18:48:51 compute-0 kernel: usbhid: USB HID core driver
Oct  8 18:48:51 compute-0 kernel: drop_monitor: Initializing network drop monitor service
Oct  8 18:48:51 compute-0 kernel: Initializing XFRM netlink socket
Oct  8 18:48:51 compute-0 kernel: NET: Registered PF_INET6 protocol family
Oct  8 18:48:51 compute-0 kernel: Segment Routing with IPv6
Oct  8 18:48:51 compute-0 kernel: NET: Registered PF_PACKET protocol family
Oct  8 18:48:51 compute-0 kernel: mpls_gso: MPLS GSO support
Oct  8 18:48:51 compute-0 kernel: IPI shorthand broadcast: enabled
Oct  8 18:48:51 compute-0 kernel: AVX2 version of gcm_enc/dec engaged.
Oct  8 18:48:51 compute-0 kernel: AES CTR mode by8 optimization enabled
Oct  8 18:48:51 compute-0 kernel: sched_clock: Marking stable (1189003860, 145107150)->(1442805359, -108694349)
Oct  8 18:48:51 compute-0 kernel: registered taskstats version 1
Oct  8 18:48:51 compute-0 kernel: Loading compiled-in X.509 certificates
Oct  8 18:48:51 compute-0 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4ff821c4997fbb659836adb05f5bc400c914e148'
Oct  8 18:48:51 compute-0 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Oct  8 18:48:51 compute-0 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Oct  8 18:48:51 compute-0 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Oct  8 18:48:51 compute-0 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Oct  8 18:48:51 compute-0 kernel: Demotion targets for Node 0: null
Oct  8 18:48:51 compute-0 kernel: page_owner is disabled
Oct  8 18:48:51 compute-0 kernel: Key type .fscrypt registered
Oct  8 18:48:51 compute-0 kernel: Key type fscrypt-provisioning registered
Oct  8 18:48:51 compute-0 kernel: Key type big_key registered
Oct  8 18:48:51 compute-0 kernel: Key type encrypted registered
Oct  8 18:48:51 compute-0 kernel: ima: No TPM chip found, activating TPM-bypass!
Oct  8 18:48:51 compute-0 kernel: Loading compiled-in module X.509 certificates
Oct  8 18:48:51 compute-0 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4ff821c4997fbb659836adb05f5bc400c914e148'
Oct  8 18:48:51 compute-0 kernel: ima: Allocated hash algorithm: sha256
Oct  8 18:48:51 compute-0 kernel: ima: No architecture policies found
Oct  8 18:48:51 compute-0 kernel: evm: Initialising EVM extended attributes:
Oct  8 18:48:51 compute-0 kernel: evm: security.selinux
Oct  8 18:48:51 compute-0 kernel: evm: security.SMACK64 (disabled)
Oct  8 18:48:51 compute-0 kernel: evm: security.SMACK64EXEC (disabled)
Oct  8 18:48:51 compute-0 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Oct  8 18:48:51 compute-0 kernel: evm: security.SMACK64MMAP (disabled)
Oct  8 18:48:51 compute-0 kernel: evm: security.apparmor (disabled)
Oct  8 18:48:51 compute-0 kernel: evm: security.ima
Oct  8 18:48:51 compute-0 kernel: evm: security.capability
Oct  8 18:48:51 compute-0 kernel: evm: HMAC attrs: 0x1
Oct  8 18:48:51 compute-0 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Oct  8 18:48:51 compute-0 kernel: Running certificate verification RSA selftest
Oct  8 18:48:51 compute-0 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Oct  8 18:48:51 compute-0 kernel: Running certificate verification ECDSA selftest
Oct  8 18:48:51 compute-0 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Oct  8 18:48:51 compute-0 kernel: clk: Disabling unused clocks
Oct  8 18:48:51 compute-0 kernel: Freeing unused decrypted memory: 2028K
Oct  8 18:48:51 compute-0 kernel: Freeing unused kernel image (initmem) memory: 4068K
Oct  8 18:48:51 compute-0 kernel: Write protecting the kernel read-only data: 30720k
Oct  8 18:48:51 compute-0 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Oct  8 18:48:51 compute-0 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Oct  8 18:48:51 compute-0 kernel: usb 1-1: Product: QEMU USB Tablet
Oct  8 18:48:51 compute-0 kernel: usb 1-1: Manufacturer: QEMU
Oct  8 18:48:51 compute-0 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Oct  8 18:48:51 compute-0 kernel: Freeing unused kernel image (rodata/data gap) memory: 340K
Oct  8 18:48:51 compute-0 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Oct  8 18:48:51 compute-0 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Oct  8 18:48:51 compute-0 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Oct  8 18:48:51 compute-0 kernel: Run /init as init process
Oct  8 18:48:51 compute-0 systemd: systemd 252-55.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Oct  8 18:48:51 compute-0 systemd: Detected virtualization kvm.
Oct  8 18:48:51 compute-0 systemd: Detected architecture x86-64.
Oct  8 18:48:51 compute-0 systemd: Running in initrd.
Oct  8 18:48:51 compute-0 systemd: No hostname configured, using default hostname.
Oct  8 18:48:51 compute-0 systemd: Hostname set to <localhost>.
Oct  8 18:48:51 compute-0 systemd: Initializing machine ID from VM UUID.
Oct  8 18:48:51 compute-0 systemd: Queued start job for default target Initrd Default Target.
Oct  8 18:48:51 compute-0 systemd: Started Dispatch Password Requests to Console Directory Watch.
Oct  8 18:48:51 compute-0 systemd: Reached target Local Encrypted Volumes.
Oct  8 18:48:51 compute-0 systemd: Reached target Initrd /usr File System.
Oct  8 18:48:51 compute-0 systemd: Reached target Local File Systems.
Oct  8 18:48:51 compute-0 systemd: Reached target Path Units.
Oct  8 18:48:51 compute-0 systemd: Reached target Slice Units.
Oct  8 18:48:51 compute-0 systemd: Reached target Swaps.
Oct  8 18:48:51 compute-0 systemd: Reached target Timer Units.
Oct  8 18:48:51 compute-0 systemd: Listening on D-Bus System Message Bus Socket.
Oct  8 18:48:51 compute-0 systemd: Listening on Journal Socket (/dev/log).
Oct  8 18:48:51 compute-0 systemd: Listening on Journal Socket.
Oct  8 18:48:51 compute-0 systemd: Listening on udev Control Socket.
Oct  8 18:48:51 compute-0 systemd: Listening on udev Kernel Socket.
Oct  8 18:48:51 compute-0 systemd: Reached target Socket Units.
Oct  8 18:48:51 compute-0 systemd: Starting Create List of Static Device Nodes...
Oct  8 18:48:51 compute-0 systemd: Starting Journal Service...
Oct  8 18:48:51 compute-0 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Oct  8 18:48:51 compute-0 systemd: Starting Apply Kernel Variables...
Oct  8 18:48:51 compute-0 systemd: Starting Create System Users...
Oct  8 18:48:51 compute-0 systemd: Starting Setup Virtual Console...
Oct  8 18:48:51 compute-0 systemd: Finished Create List of Static Device Nodes.
Oct  8 18:48:51 compute-0 systemd: Finished Apply Kernel Variables.
Oct  8 18:48:51 compute-0 systemd: Finished Create System Users.
Oct  8 18:48:51 compute-0 systemd: Starting Create Static Device Nodes in /dev...
Oct  8 18:48:51 compute-0 systemd-journald[305]: Journal started
Oct  8 18:48:51 compute-0 systemd-journald[305]: Runtime Journal (/run/log/journal/9ff32318d7e04b37bb6eea4cfd795672) is 8.0M, max 153.5M, 145.5M free.
Oct  8 18:48:51 compute-0 systemd-sysusers[308]: Creating group 'users' with GID 100.
Oct  8 18:48:51 compute-0 systemd-sysusers[308]: Creating group 'dbus' with GID 81.
Oct  8 18:48:51 compute-0 systemd-sysusers[308]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Oct  8 18:48:51 compute-0 systemd: Started Journal Service.
Oct  8 18:48:51 compute-0 systemd[1]: Starting Create Volatile Files and Directories...
Oct  8 18:48:51 compute-0 systemd[1]: Finished Create Static Device Nodes in /dev.
Oct  8 18:48:51 compute-0 systemd[1]: Finished Create Volatile Files and Directories.
Oct  8 18:48:51 compute-0 systemd[1]: Finished Setup Virtual Console.
Oct  8 18:48:51 compute-0 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Oct  8 18:48:51 compute-0 systemd[1]: Starting dracut cmdline hook...
Oct  8 18:48:51 compute-0 dracut-cmdline[327]: dracut-9 dracut-057-102.git20250818.el9
Oct  8 18:48:51 compute-0 dracut-cmdline[327]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64 root=UUID=1631a6ad-43b8-436d-ae76-16fa14b94458 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct  8 18:48:51 compute-0 systemd[1]: Finished dracut cmdline hook.
Oct  8 18:48:51 compute-0 systemd[1]: Starting dracut pre-udev hook...
Oct  8 18:48:51 compute-0 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Oct  8 18:48:51 compute-0 kernel: device-mapper: uevent: version 1.0.3
Oct  8 18:48:51 compute-0 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Oct  8 18:48:51 compute-0 kernel: RPC: Registered named UNIX socket transport module.
Oct  8 18:48:51 compute-0 kernel: RPC: Registered udp transport module.
Oct  8 18:48:51 compute-0 kernel: RPC: Registered tcp transport module.
Oct  8 18:48:51 compute-0 kernel: RPC: Registered tcp-with-tls transport module.
Oct  8 18:48:51 compute-0 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Oct  8 18:48:51 compute-0 rpc.statd[443]: Version 2.5.4 starting
Oct  8 18:48:51 compute-0 rpc.statd[443]: Initializing NSM state
Oct  8 18:48:52 compute-0 rpc.idmapd[448]: Setting log level to 0
Oct  8 18:48:52 compute-0 systemd[1]: Finished dracut pre-udev hook.
Oct  8 18:48:52 compute-0 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Oct  8 18:48:52 compute-0 systemd-udevd[461]: Using default interface naming scheme 'rhel-9.0'.
Oct  8 18:48:52 compute-0 systemd[1]: Started Rule-based Manager for Device Events and Files.
Oct  8 18:48:52 compute-0 systemd[1]: Starting dracut pre-trigger hook...
Oct  8 18:48:52 compute-0 systemd[1]: Finished dracut pre-trigger hook.
Oct  8 18:48:52 compute-0 systemd[1]: Starting Coldplug All udev Devices...
Oct  8 18:48:52 compute-0 systemd[1]: Created slice Slice /system/modprobe.
Oct  8 18:48:52 compute-0 systemd[1]: Starting Load Kernel Module configfs...
Oct  8 18:48:52 compute-0 systemd[1]: Finished Coldplug All udev Devices.
Oct  8 18:48:52 compute-0 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct  8 18:48:52 compute-0 systemd[1]: Finished Load Kernel Module configfs.
Oct  8 18:48:52 compute-0 systemd[1]: Mounting Kernel Configuration File System...
Oct  8 18:48:52 compute-0 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Oct  8 18:48:52 compute-0 systemd[1]: Reached target Network.
Oct  8 18:48:52 compute-0 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Oct  8 18:48:52 compute-0 systemd[1]: Starting dracut initqueue hook...
Oct  8 18:48:52 compute-0 systemd[1]: Mounted Kernel Configuration File System.
Oct  8 18:48:52 compute-0 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Oct  8 18:48:52 compute-0 systemd[1]: Reached target System Initialization.
Oct  8 18:48:52 compute-0 systemd[1]: Reached target Basic System.
Oct  8 18:48:52 compute-0 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Oct  8 18:48:52 compute-0 kernel: vda: vda1
Oct  8 18:48:52 compute-0 kernel: scsi host0: ata_piix
Oct  8 18:48:52 compute-0 kernel: scsi host1: ata_piix
Oct  8 18:48:52 compute-0 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc180 irq 14 lpm-pol 0
Oct  8 18:48:52 compute-0 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc188 irq 15 lpm-pol 0
Oct  8 18:48:52 compute-0 systemd[1]: Found device /dev/disk/by-uuid/1631a6ad-43b8-436d-ae76-16fa14b94458.
Oct  8 18:48:52 compute-0 systemd[1]: Reached target Initrd Root Device.
Oct  8 18:48:52 compute-0 kernel: ata1: found unknown device (class 0)
Oct  8 18:48:52 compute-0 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Oct  8 18:48:52 compute-0 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Oct  8 18:48:52 compute-0 systemd-udevd[473]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 18:48:52 compute-0 systemd-udevd[481]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 18:48:52 compute-0 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Oct  8 18:48:52 compute-0 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Oct  8 18:48:52 compute-0 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Oct  8 18:48:52 compute-0 systemd[1]: Finished dracut initqueue hook.
Oct  8 18:48:52 compute-0 systemd[1]: Reached target Preparation for Remote File Systems.
Oct  8 18:48:52 compute-0 systemd[1]: Reached target Remote Encrypted Volumes.
Oct  8 18:48:52 compute-0 systemd[1]: Reached target Remote File Systems.
Oct  8 18:48:52 compute-0 systemd[1]: Starting dracut pre-mount hook...
Oct  8 18:48:52 compute-0 systemd[1]: Finished dracut pre-mount hook.
Oct  8 18:48:52 compute-0 systemd[1]: Starting File System Check on /dev/disk/by-uuid/1631a6ad-43b8-436d-ae76-16fa14b94458...
Oct  8 18:48:52 compute-0 systemd-fsck[554]: /usr/sbin/fsck.xfs: XFS file system.
Oct  8 18:48:52 compute-0 systemd[1]: Finished File System Check on /dev/disk/by-uuid/1631a6ad-43b8-436d-ae76-16fa14b94458.
Oct  8 18:48:52 compute-0 systemd[1]: Mounting /sysroot...
Oct  8 18:48:53 compute-0 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Oct  8 18:48:53 compute-0 kernel: XFS (vda1): Mounting V5 Filesystem 1631a6ad-43b8-436d-ae76-16fa14b94458
Oct  8 18:48:53 compute-0 kernel: XFS (vda1): Ending clean mount
Oct  8 18:48:53 compute-0 systemd[1]: Mounted /sysroot.
Oct  8 18:48:53 compute-0 systemd[1]: Reached target Initrd Root File System.
Oct  8 18:48:53 compute-0 systemd[1]: Starting Mountpoints Configured in the Real Root...
Oct  8 18:48:53 compute-0 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Oct  8 18:48:53 compute-0 systemd[1]: Finished Mountpoints Configured in the Real Root.
Oct  8 18:48:53 compute-0 systemd[1]: Reached target Initrd File Systems.
Oct  8 18:48:53 compute-0 systemd[1]: Reached target Initrd Default Target.
Oct  8 18:48:53 compute-0 systemd[1]: Starting dracut mount hook...
Oct  8 18:48:53 compute-0 systemd[1]: Finished dracut mount hook.
Oct  8 18:48:53 compute-0 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Oct  8 18:48:53 compute-0 rpc.idmapd[448]: exiting on signal 15
Oct  8 18:48:53 compute-0 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Oct  8 18:48:53 compute-0 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Oct  8 18:48:53 compute-0 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Oct  8 18:48:53 compute-0 systemd[1]: Stopped target Network.
Oct  8 18:48:53 compute-0 systemd[1]: Stopped target Remote Encrypted Volumes.
Oct  8 18:48:53 compute-0 systemd[1]: Stopped target Timer Units.
Oct  8 18:48:53 compute-0 systemd[1]: dbus.socket: Deactivated successfully.
Oct  8 18:48:53 compute-0 systemd[1]: Closed D-Bus System Message Bus Socket.
Oct  8 18:48:53 compute-0 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Oct  8 18:48:53 compute-0 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Oct  8 18:48:53 compute-0 systemd[1]: Stopped target Initrd Default Target.
Oct  8 18:48:53 compute-0 systemd[1]: Stopped target Basic System.
Oct  8 18:48:53 compute-0 systemd[1]: Stopped target Initrd Root Device.
Oct  8 18:48:53 compute-0 systemd[1]: Stopped target Initrd /usr File System.
Oct  8 18:48:53 compute-0 systemd[1]: Stopped target Path Units.
Oct  8 18:48:53 compute-0 systemd[1]: Stopped target Remote File Systems.
Oct  8 18:48:53 compute-0 systemd[1]: Stopped target Preparation for Remote File Systems.
Oct  8 18:48:53 compute-0 systemd[1]: Stopped target Slice Units.
Oct  8 18:48:53 compute-0 systemd[1]: Stopped target Socket Units.
Oct  8 18:48:53 compute-0 systemd[1]: Stopped target System Initialization.
Oct  8 18:48:53 compute-0 systemd[1]: Stopped target Local File Systems.
Oct  8 18:48:53 compute-0 systemd[1]: Stopped target Swaps.
Oct  8 18:48:53 compute-0 systemd[1]: dracut-mount.service: Deactivated successfully.
Oct  8 18:48:53 compute-0 systemd[1]: Stopped dracut mount hook.
Oct  8 18:48:53 compute-0 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Oct  8 18:48:53 compute-0 systemd[1]: Stopped dracut pre-mount hook.
Oct  8 18:48:53 compute-0 systemd[1]: Stopped target Local Encrypted Volumes.
Oct  8 18:48:53 compute-0 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Oct  8 18:48:53 compute-0 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Oct  8 18:48:53 compute-0 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Oct  8 18:48:53 compute-0 systemd[1]: Stopped dracut initqueue hook.
Oct  8 18:48:53 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Oct  8 18:48:53 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Oct  8 18:48:53 compute-0 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Oct  8 18:48:53 compute-0 systemd[1]: Stopped Create Volatile Files and Directories.
Oct  8 18:48:53 compute-0 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Oct  8 18:48:53 compute-0 systemd[1]: Stopped Coldplug All udev Devices.
Oct  8 18:48:53 compute-0 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Oct  8 18:48:53 compute-0 systemd[1]: Stopped dracut pre-trigger hook.
Oct  8 18:48:53 compute-0 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Oct  8 18:48:53 compute-0 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Oct  8 18:48:53 compute-0 systemd[1]: Stopped Setup Virtual Console.
Oct  8 18:48:53 compute-0 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Oct  8 18:48:53 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Oct  8 18:48:53 compute-0 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Oct  8 18:48:53 compute-0 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Oct  8 18:48:53 compute-0 systemd[1]: systemd-udevd.service: Deactivated successfully.
Oct  8 18:48:53 compute-0 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Oct  8 18:48:53 compute-0 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Oct  8 18:48:53 compute-0 systemd[1]: Closed udev Control Socket.
Oct  8 18:48:53 compute-0 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Oct  8 18:48:53 compute-0 systemd[1]: Closed udev Kernel Socket.
Oct  8 18:48:53 compute-0 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Oct  8 18:48:53 compute-0 systemd[1]: Stopped dracut pre-udev hook.
Oct  8 18:48:53 compute-0 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Oct  8 18:48:53 compute-0 systemd[1]: Stopped dracut cmdline hook.
Oct  8 18:48:53 compute-0 systemd[1]: Starting Cleanup udev Database...
Oct  8 18:48:53 compute-0 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Oct  8 18:48:53 compute-0 systemd[1]: Stopped Create Static Device Nodes in /dev.
Oct  8 18:48:53 compute-0 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Oct  8 18:48:53 compute-0 systemd[1]: Stopped Create List of Static Device Nodes.
Oct  8 18:48:53 compute-0 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Oct  8 18:48:53 compute-0 systemd[1]: Stopped Create System Users.
Oct  8 18:48:53 compute-0 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Oct  8 18:48:53 compute-0 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Oct  8 18:48:53 compute-0 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Oct  8 18:48:53 compute-0 systemd[1]: Finished Cleanup udev Database.
Oct  8 18:48:53 compute-0 systemd[1]: Reached target Switch Root.
Oct  8 18:48:53 compute-0 systemd[1]: Starting Switch Root...
Oct  8 18:48:53 compute-0 systemd[1]: Switching root.
Oct  8 18:48:53 compute-0 systemd-journald[305]: Journal stopped
Oct  8 18:48:55 compute-0 systemd-journald: Received SIGTERM from PID 1 (systemd).
Oct  8 18:48:55 compute-0 kernel: audit: type=1404 audit(1759949334.256:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Oct  8 18:48:55 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Oct  8 18:48:55 compute-0 kernel: SELinux:  policy capability open_perms=1
Oct  8 18:48:55 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Oct  8 18:48:55 compute-0 kernel: SELinux:  policy capability always_check_network=0
Oct  8 18:48:55 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  8 18:48:55 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  8 18:48:55 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  8 18:48:55 compute-0 kernel: audit: type=1403 audit(1759949334.413:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Oct  8 18:48:55 compute-0 systemd: Successfully loaded SELinux policy in 161.369ms.
Oct  8 18:48:55 compute-0 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 37.390ms.
Oct  8 18:48:55 compute-0 systemd: systemd 252-57.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Oct  8 18:48:55 compute-0 systemd: Detected virtualization kvm.
Oct  8 18:48:55 compute-0 systemd: Detected architecture x86-64.
Oct  8 18:48:55 compute-0 systemd: Hostname set to <compute-0>.
Oct  8 18:48:55 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:48:55 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:48:55 compute-0 systemd: initrd-switch-root.service: Deactivated successfully.
Oct  8 18:48:55 compute-0 systemd: Stopped Switch Root.
Oct  8 18:48:55 compute-0 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Oct  8 18:48:55 compute-0 systemd: Created slice Slice /system/getty.
Oct  8 18:48:55 compute-0 systemd: Created slice Slice /system/serial-getty.
Oct  8 18:48:55 compute-0 systemd: Created slice Slice /system/sshd-keygen.
Oct  8 18:48:55 compute-0 systemd: Created slice User and Session Slice.
Oct  8 18:48:55 compute-0 systemd: Started Dispatch Password Requests to Console Directory Watch.
Oct  8 18:48:55 compute-0 systemd: Started Forward Password Requests to Wall Directory Watch.
Oct  8 18:48:55 compute-0 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Oct  8 18:48:55 compute-0 systemd: Reached target Local Encrypted Volumes.
Oct  8 18:48:55 compute-0 systemd: Stopped target Switch Root.
Oct  8 18:48:55 compute-0 systemd: Stopped target Initrd File Systems.
Oct  8 18:48:55 compute-0 systemd: Stopped target Initrd Root File System.
Oct  8 18:48:55 compute-0 systemd: Reached target Local Integrity Protected Volumes.
Oct  8 18:48:55 compute-0 systemd: Reached target Path Units.
Oct  8 18:48:55 compute-0 systemd: Reached target rpc_pipefs.target.
Oct  8 18:48:55 compute-0 systemd: Reached target Slice Units.
Oct  8 18:48:55 compute-0 systemd: Reached target Local Verity Protected Volumes.
Oct  8 18:48:55 compute-0 systemd: Listening on Device-mapper event daemon FIFOs.
Oct  8 18:48:55 compute-0 systemd: Listening on LVM2 poll daemon socket.
Oct  8 18:48:55 compute-0 systemd: Listening on RPCbind Server Activation Socket.
Oct  8 18:48:55 compute-0 systemd: Reached target RPC Port Mapper.
Oct  8 18:48:55 compute-0 systemd: Listening on Process Core Dump Socket.
Oct  8 18:48:55 compute-0 systemd: Listening on initctl Compatibility Named Pipe.
Oct  8 18:48:55 compute-0 systemd: Listening on udev Control Socket.
Oct  8 18:48:55 compute-0 systemd: Listening on udev Kernel Socket.
Oct  8 18:48:55 compute-0 systemd: Mounting Huge Pages File System...
Oct  8 18:48:55 compute-0 systemd: Mounting /dev/hugepages1G...
Oct  8 18:48:55 compute-0 systemd: Mounting /dev/hugepages2M...
Oct  8 18:48:55 compute-0 systemd: Mounting POSIX Message Queue File System...
Oct  8 18:48:55 compute-0 systemd: Mounting Kernel Debug File System...
Oct  8 18:48:55 compute-0 systemd: Mounting Kernel Trace File System...
Oct  8 18:48:55 compute-0 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Oct  8 18:48:55 compute-0 systemd: Starting Create List of Static Device Nodes...
Oct  8 18:48:55 compute-0 systemd: Load legacy module configuration was skipped because no trigger condition checks were met.
Oct  8 18:48:55 compute-0 systemd: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Oct  8 18:48:55 compute-0 systemd: Starting Load Kernel Module configfs...
Oct  8 18:48:55 compute-0 systemd: Starting Load Kernel Module drm...
Oct  8 18:48:55 compute-0 systemd: Starting Load Kernel Module efi_pstore...
Oct  8 18:48:55 compute-0 systemd: Starting Load Kernel Module fuse...
Oct  8 18:48:55 compute-0 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Oct  8 18:48:55 compute-0 systemd: systemd-fsck-root.service: Deactivated successfully.
Oct  8 18:48:55 compute-0 systemd: Stopped File System Check on Root Device.
Oct  8 18:48:55 compute-0 systemd: Stopped Journal Service.
Oct  8 18:48:55 compute-0 kernel: fuse: init (API version 7.37)
Oct  8 18:48:55 compute-0 systemd: Starting Journal Service...
Oct  8 18:48:55 compute-0 systemd: Starting Load Kernel Modules...
Oct  8 18:48:55 compute-0 systemd: Starting Generate network units from Kernel command line...
Oct  8 18:48:55 compute-0 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct  8 18:48:55 compute-0 systemd: Starting Remount Root and Kernel File Systems...
Oct  8 18:48:55 compute-0 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Oct  8 18:48:55 compute-0 systemd: Starting Coldplug All udev Devices...
Oct  8 18:48:55 compute-0 systemd-journald[688]: Journal started
Oct  8 18:48:55 compute-0 systemd-journald[688]: Runtime Journal (/run/log/journal/42833e1b511a402df82cb9cb2fc36491) is 8.0M, max 153.5M, 145.5M free.
Oct  8 18:48:55 compute-0 systemd[1]: Queued start job for default target Multi-User System.
Oct  8 18:48:55 compute-0 systemd[1]: systemd-journald.service: Deactivated successfully.
Oct  8 18:48:55 compute-0 systemd: Started Journal Service.
Oct  8 18:48:55 compute-0 systemd[1]: Mounted Huge Pages File System.
Oct  8 18:48:55 compute-0 kernel: ACPI: bus type drm_connector registered
Oct  8 18:48:55 compute-0 systemd[1]: Mounted /dev/hugepages1G.
Oct  8 18:48:55 compute-0 systemd[1]: Mounted /dev/hugepages2M.
Oct  8 18:48:55 compute-0 systemd[1]: Mounted POSIX Message Queue File System.
Oct  8 18:48:55 compute-0 systemd[1]: Mounted Kernel Debug File System.
Oct  8 18:48:55 compute-0 systemd[1]: Mounted Kernel Trace File System.
Oct  8 18:48:55 compute-0 systemd[1]: Finished Create List of Static Device Nodes.
Oct  8 18:48:55 compute-0 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct  8 18:48:55 compute-0 systemd[1]: Finished Load Kernel Module configfs.
Oct  8 18:48:55 compute-0 systemd[1]: modprobe@drm.service: Deactivated successfully.
Oct  8 18:48:55 compute-0 systemd[1]: Finished Load Kernel Module drm.
Oct  8 18:48:55 compute-0 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Oct  8 18:48:55 compute-0 systemd[1]: Finished Load Kernel Module efi_pstore.
Oct  8 18:48:55 compute-0 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Oct  8 18:48:55 compute-0 systemd[1]: Finished Load Kernel Module fuse.
Oct  8 18:48:55 compute-0 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Oct  8 18:48:55 compute-0 systemd[1]: Finished Generate network units from Kernel command line.
Oct  8 18:48:55 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Oct  8 18:48:55 compute-0 systemd[1]: Mounting FUSE Control File System...
Oct  8 18:48:55 compute-0 systemd[1]: Mounted FUSE Control File System.
Oct  8 18:48:55 compute-0 kernel: Bridge firewalling registered
Oct  8 18:48:55 compute-0 systemd-modules-load[689]: Inserted module 'br_netfilter'
Oct  8 18:48:55 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Oct  8 18:48:55 compute-0 systemd-modules-load[689]: Inserted module 'nf_conntrack'
Oct  8 18:48:55 compute-0 systemd[1]: Finished Load Kernel Modules.
Oct  8 18:48:55 compute-0 systemd[1]: Starting Apply Kernel Variables...
Oct  8 18:48:55 compute-0 systemd[1]: Finished Apply Kernel Variables.
Oct  8 18:48:55 compute-0 systemd[1]: Finished Coldplug All udev Devices.
Oct  8 18:48:55 compute-0 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Oct  8 18:48:55 compute-0 systemd[1]: Finished Remount Root and Kernel File Systems.
Oct  8 18:48:55 compute-0 systemd[1]: Activating swap /swap...
Oct  8 18:48:55 compute-0 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Oct  8 18:48:55 compute-0 systemd[1]: Rebuild Hardware Database was skipped because of an unmet condition check (ConditionNeedsUpdate=/etc).
Oct  8 18:48:55 compute-0 systemd[1]: Starting Flush Journal to Persistent Storage...
Oct  8 18:48:55 compute-0 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Oct  8 18:48:55 compute-0 systemd[1]: Starting Load/Save OS Random Seed...
Oct  8 18:48:55 compute-0 systemd[1]: Create System Users was skipped because no trigger condition checks were met.
Oct  8 18:48:55 compute-0 systemd[1]: Starting Create Static Device Nodes in /dev...
Oct  8 18:48:55 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Oct  8 18:48:55 compute-0 systemd[1]: Activated swap /swap.
Oct  8 18:48:55 compute-0 systemd[1]: Reached target Swaps.
Oct  8 18:48:55 compute-0 systemd-journald[688]: Time spent on flushing to /var/log/journal/42833e1b511a402df82cb9cb2fc36491 is 15.633ms for 786 entries.
Oct  8 18:48:55 compute-0 systemd-journald[688]: System Journal (/var/log/journal/42833e1b511a402df82cb9cb2fc36491) is 8.0M, max 4.0G, 3.9G free.
Oct  8 18:48:55 compute-0 systemd-journald[688]: Received client request to flush runtime journal.
Oct  8 18:48:55 compute-0 systemd[1]: Finished Load/Save OS Random Seed.
Oct  8 18:48:55 compute-0 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Oct  8 18:48:55 compute-0 systemd[1]: Finished Flush Journal to Persistent Storage.
Oct  8 18:48:55 compute-0 systemd[1]: Finished Create Static Device Nodes in /dev.
Oct  8 18:48:55 compute-0 systemd[1]: Reached target Preparation for Local File Systems.
Oct  8 18:48:55 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Oct  8 18:48:55 compute-0 systemd[1]: Reached target Local File Systems.
Oct  8 18:48:55 compute-0 systemd[1]: Starting Import network configuration from initramfs...
Oct  8 18:48:55 compute-0 systemd[1]: Rebuild Dynamic Linker Cache was skipped because no trigger condition checks were met.
Oct  8 18:48:55 compute-0 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Oct  8 18:48:55 compute-0 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Oct  8 18:48:55 compute-0 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Oct  8 18:48:55 compute-0 systemd[1]: Starting Automatic Boot Loader Update...
Oct  8 18:48:55 compute-0 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Oct  8 18:48:55 compute-0 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Oct  8 18:48:55 compute-0 bootctl[705]: Couldn't find EFI system partition, skipping.
Oct  8 18:48:55 compute-0 systemd[1]: Finished Automatic Boot Loader Update.
Oct  8 18:48:55 compute-0 systemd[1]: Finished Import network configuration from initramfs.
Oct  8 18:48:55 compute-0 systemd-udevd[706]: Using default interface naming scheme 'rhel-9.0'.
Oct  8 18:48:55 compute-0 systemd[1]: Starting Create Volatile Files and Directories...
Oct  8 18:48:55 compute-0 systemd[1]: Started Rule-based Manager for Device Events and Files.
Oct  8 18:48:55 compute-0 systemd[1]: Starting Load Kernel Module configfs...
Oct  8 18:48:55 compute-0 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct  8 18:48:55 compute-0 systemd[1]: Finished Load Kernel Module configfs.
Oct  8 18:48:55 compute-0 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Oct  8 18:48:55 compute-0 systemd-udevd[741]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 18:48:55 compute-0 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Oct  8 18:48:55 compute-0 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Oct  8 18:48:55 compute-0 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Oct  8 18:48:55 compute-0 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Oct  8 18:48:55 compute-0 systemd[1]: Finished Create Volatile Files and Directories.
Oct  8 18:48:55 compute-0 systemd[1]: Starting Security Auditing Service...
Oct  8 18:48:55 compute-0 systemd[1]: Starting RPC Bind...
Oct  8 18:48:55 compute-0 systemd[1]: Rebuild Journal Catalog was skipped because of an unmet condition check (ConditionNeedsUpdate=/var).
Oct  8 18:48:55 compute-0 systemd[1]: Update is Completed was skipped because no trigger condition checks were met.
Oct  8 18:48:55 compute-0 auditd[775]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Oct  8 18:48:55 compute-0 auditd[775]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Oct  8 18:48:55 compute-0 systemd-udevd[723]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 18:48:55 compute-0 systemd[1]: Started RPC Bind.
Oct  8 18:48:55 compute-0 augenrules[781]: /sbin/augenrules: No change
Oct  8 18:48:55 compute-0 augenrules[799]: No rules
Oct  8 18:48:55 compute-0 augenrules[799]: enabled 1
Oct  8 18:48:55 compute-0 augenrules[799]: failure 1
Oct  8 18:48:55 compute-0 augenrules[799]: pid 775
Oct  8 18:48:55 compute-0 augenrules[799]: rate_limit 0
Oct  8 18:48:55 compute-0 augenrules[799]: backlog_limit 8192
Oct  8 18:48:55 compute-0 augenrules[799]: lost 0
Oct  8 18:48:55 compute-0 augenrules[799]: backlog 3
Oct  8 18:48:55 compute-0 augenrules[799]: backlog_wait_time 60000
Oct  8 18:48:55 compute-0 augenrules[799]: backlog_wait_time_actual 0
Oct  8 18:48:55 compute-0 augenrules[799]: enabled 1
Oct  8 18:48:55 compute-0 augenrules[799]: failure 1
Oct  8 18:48:55 compute-0 augenrules[799]: pid 775
Oct  8 18:48:55 compute-0 augenrules[799]: rate_limit 0
Oct  8 18:48:55 compute-0 augenrules[799]: backlog_limit 8192
Oct  8 18:48:55 compute-0 augenrules[799]: lost 0
Oct  8 18:48:55 compute-0 augenrules[799]: backlog 0
Oct  8 18:48:55 compute-0 augenrules[799]: backlog_wait_time 60000
Oct  8 18:48:55 compute-0 augenrules[799]: backlog_wait_time_actual 0
Oct  8 18:48:55 compute-0 augenrules[799]: enabled 1
Oct  8 18:48:55 compute-0 augenrules[799]: failure 1
Oct  8 18:48:55 compute-0 augenrules[799]: pid 775
Oct  8 18:48:55 compute-0 augenrules[799]: rate_limit 0
Oct  8 18:48:55 compute-0 augenrules[799]: backlog_limit 8192
Oct  8 18:48:55 compute-0 augenrules[799]: lost 0
Oct  8 18:48:55 compute-0 augenrules[799]: backlog 0
Oct  8 18:48:55 compute-0 augenrules[799]: backlog_wait_time 60000
Oct  8 18:48:55 compute-0 augenrules[799]: backlog_wait_time_actual 0
Oct  8 18:48:55 compute-0 systemd[1]: Started Security Auditing Service.
Oct  8 18:48:55 compute-0 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Oct  8 18:48:55 compute-0 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Oct  8 18:48:55 compute-0 kernel: Console: switching to colour dummy device 80x25
Oct  8 18:48:55 compute-0 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Oct  8 18:48:55 compute-0 kernel: [drm] features: -context_init
Oct  8 18:48:55 compute-0 kernel: [drm] number of scanouts: 1
Oct  8 18:48:55 compute-0 kernel: [drm] number of cap sets: 0
Oct  8 18:48:55 compute-0 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Oct  8 18:48:55 compute-0 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Oct  8 18:48:55 compute-0 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Oct  8 18:48:55 compute-0 kernel: kvm_amd: TSC scaling supported
Oct  8 18:48:55 compute-0 kernel: kvm_amd: Nested Virtualization enabled
Oct  8 18:48:55 compute-0 kernel: kvm_amd: Nested Paging enabled
Oct  8 18:48:55 compute-0 kernel: kvm_amd: LBR virtualization supported
Oct  8 18:48:55 compute-0 kernel: Console: switching to colour frame buffer device 128x48
Oct  8 18:48:55 compute-0 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Oct  8 18:48:55 compute-0 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Oct  8 18:48:56 compute-0 systemd[1]: Reached target System Initialization.
Oct  8 18:48:56 compute-0 systemd[1]: Started dnf makecache --timer.
Oct  8 18:48:56 compute-0 systemd[1]: Started Daily rotation of log files.
Oct  8 18:48:56 compute-0 systemd[1]: Started Run system activity accounting tool every 10 minutes.
Oct  8 18:48:56 compute-0 systemd[1]: Started Generate summary of yesterday's process accounting.
Oct  8 18:48:56 compute-0 systemd[1]: Started Daily Cleanup of Temporary Directories.
Oct  8 18:48:56 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Oct  8 18:48:56 compute-0 systemd[1]: Reached target Timer Units.
Oct  8 18:48:56 compute-0 systemd[1]: Listening on D-Bus System Message Bus Socket.
Oct  8 18:48:56 compute-0 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Oct  8 18:48:56 compute-0 systemd[1]: Reached target Socket Units.
Oct  8 18:48:56 compute-0 systemd[1]: Starting D-Bus System Message Bus...
Oct  8 18:48:56 compute-0 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct  8 18:48:56 compute-0 systemd[1]: Started D-Bus System Message Bus.
Oct  8 18:48:56 compute-0 systemd[1]: Reached target Basic System.
Oct  8 18:48:56 compute-0 dbus-broker-lau[835]: Ready
Oct  8 18:48:56 compute-0 systemd[1]: Starting NTP client/server...
Oct  8 18:48:56 compute-0 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Oct  8 18:48:56 compute-0 systemd[1]: Starting Restore /run/initramfs on shutdown...
Oct  8 18:48:56 compute-0 systemd[1]: Started irqbalance daemon.
Oct  8 18:48:56 compute-0 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Oct  8 18:48:56 compute-0 systemd[1]: Starting Create netns directory...
Oct  8 18:48:56 compute-0 systemd[1]: Starting Netfilter Tables...
Oct  8 18:48:56 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  8 18:48:56 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  8 18:48:56 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  8 18:48:56 compute-0 systemd[1]: Reached target sshd-keygen.target.
Oct  8 18:48:56 compute-0 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Oct  8 18:48:56 compute-0 systemd[1]: Reached target User and Group Name Lookups.
Oct  8 18:48:56 compute-0 systemd[1]: Starting Resets System Activity Logs...
Oct  8 18:48:56 compute-0 systemd[1]: Starting User Login Management...
Oct  8 18:48:56 compute-0 systemd[1]: Finished Restore /run/initramfs on shutdown.
Oct  8 18:48:56 compute-0 systemd[1]: Finished Resets System Activity Logs.
Oct  8 18:48:56 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  8 18:48:56 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  8 18:48:56 compute-0 systemd[1]: Finished Create netns directory.
Oct  8 18:48:56 compute-0 chronyd[850]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Oct  8 18:48:56 compute-0 chronyd[850]: Frequency -28.709 +/- 0.114 ppm read from /var/lib/chrony/drift
Oct  8 18:48:56 compute-0 chronyd[850]: Loaded seccomp filter (level 2)
Oct  8 18:48:56 compute-0 systemd[1]: Started NTP client/server.
Oct  8 18:48:56 compute-0 systemd-logind[844]: New seat seat0.
Oct  8 18:48:56 compute-0 systemd-logind[844]: Watching system buttons on /dev/input/event0 (Power Button)
Oct  8 18:48:56 compute-0 systemd-logind[844]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Oct  8 18:48:56 compute-0 systemd[1]: Started User Login Management.
Oct  8 18:48:56 compute-0 systemd[1]: Finished Netfilter Tables.
Oct  8 18:48:57 compute-0 cloud-init[870]: Cloud-init v. 24.4-7.el9 running 'init-local' at Wed, 08 Oct 2025 18:48:57 +0000. Up 8.27 seconds.
Oct  8 18:48:57 compute-0 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Oct  8 18:48:57 compute-0 systemd[1]: Reached target Preparation for Network.
Oct  8 18:48:57 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Oct  8 18:48:57 compute-0 chown[872]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Oct  8 18:48:58 compute-0 ovs-ctl[877]: Starting ovsdb-server [  OK  ]
Oct  8 18:48:58 compute-0 ovs-vsctl[926]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Oct  8 18:48:58 compute-0 ovs-vsctl[936]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"47f81f7a-64d8-418a-a74c-b879bd6deb83\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Oct  8 18:48:58 compute-0 ovs-ctl[877]: Configuring Open vSwitch system IDs [  OK  ]
Oct  8 18:48:58 compute-0 ovs-vsctl[942]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Oct  8 18:48:58 compute-0 ovs-ctl[877]: Enabling remote OVSDB managers [  OK  ]
Oct  8 18:48:58 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Oct  8 18:48:58 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Oct  8 18:48:58 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Oct  8 18:48:58 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Oct  8 18:48:58 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Oct  8 18:48:58 compute-0 ovs-ctl[987]: Inserting openvswitch module [  OK  ]
Oct  8 18:48:58 compute-0 kernel: ovs-system: entered promiscuous mode
Oct  8 18:48:58 compute-0 kernel: Timeout policy base is empty
Oct  8 18:48:58 compute-0 systemd-udevd[1012]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 18:48:58 compute-0 kernel: vlan22: entered promiscuous mode
Oct  8 18:48:58 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Oct  8 18:48:58 compute-0 systemd-udevd[1013]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 18:48:58 compute-0 kernel: vlan20: entered promiscuous mode
Oct  8 18:48:58 compute-0 systemd-udevd[1022]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 18:48:58 compute-0 kernel: vlan21: entered promiscuous mode
Oct  8 18:48:58 compute-0 ovs-ctl[956]: Starting ovs-vswitchd [  OK  ]
Oct  8 18:48:58 compute-0 ovs-vsctl[1033]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Oct  8 18:48:58 compute-0 ovs-ctl[956]: Enabling remote OVSDB managers [  OK  ]
Oct  8 18:48:58 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Oct  8 18:48:58 compute-0 systemd[1]: Starting Open vSwitch...
Oct  8 18:48:58 compute-0 systemd[1]: Finished Open vSwitch.
Oct  8 18:48:58 compute-0 systemd[1]: Starting Network Manager...
Oct  8 18:48:58 compute-0 NetworkManager[1035]: <info>  [1759949338.8736] NetworkManager (version 1.54.1-1.el9) is starting... (boot:d538b0ba-483c-4d09-9bda-0412f54534f3)
Oct  8 18:48:58 compute-0 NetworkManager[1035]: <info>  [1759949338.8742] Read config: /etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf
Oct  8 18:48:58 compute-0 NetworkManager[1035]: <info>  [1759949338.8925] manager[0x55b9c6fb6040]: monitoring kernel firmware directory '/lib/firmware'.
Oct  8 18:48:58 compute-0 systemd[1]: Starting Hostname Service...
Oct  8 18:48:58 compute-0 systemd[1]: Started Hostname Service.
Oct  8 18:48:58 compute-0 NetworkManager[1035]: <info>  [1759949338.9706] hostname: hostname: using hostnamed
Oct  8 18:48:58 compute-0 NetworkManager[1035]: <info>  [1759949338.9706] hostname: static hostname changed from (none) to "compute-0"
Oct  8 18:48:58 compute-0 NetworkManager[1035]: <info>  [1759949338.9714] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct  8 18:48:58 compute-0 NetworkManager[1035]: <info>  [1759949338.9876] manager[0x55b9c6fb6040]: rfkill: Wi-Fi hardware radio set enabled
Oct  8 18:48:58 compute-0 NetworkManager[1035]: <info>  [1759949338.9876] manager[0x55b9c6fb6040]: rfkill: WWAN hardware radio set enabled
Oct  8 18:48:58 compute-0 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Oct  8 18:48:58 compute-0 NetworkManager[1035]: <info>  [1759949338.9984] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0023] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0024] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0025] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0026] manager: Networking is enabled by state file
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0033] settings: Loaded settings plugin: keyfile (internal)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0064] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0157] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0179] dhcp: init: Using DHCP client 'internal'
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0181] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0191] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0202] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct  8 18:48:59 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0220] device (lo): Activation: starting connection 'lo' (aed4deb5-95bc-489e-9824-933efef54b8c)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0227] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0229] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0247] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/3)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0249] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0262] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/4)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0264] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0278] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/5)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0280] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0293] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/6)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0297] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0310] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0312] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0317] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0319] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0324] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/9)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0325] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0330] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/10)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0332] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0337] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/11)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0340] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0345] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/12)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0347] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0352] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/13)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0354] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0374] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct  8 18:48:59 compute-0 systemd[1]: Started Network Manager.
Oct  8 18:48:59 compute-0 systemd[1]: Reached target Network.
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0391] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0403] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0405] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0406] device (eth0): carrier: link connected
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0407] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0408] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0409] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0410] device (eth1): carrier: link connected
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0415] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0421] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0425] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Oct  8 18:48:59 compute-0 kernel: vlan20: left promiscuous mode
Oct  8 18:48:59 compute-0 systemd[1]: Starting Network Manager Wait Online...
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <error> [1759949339.0441] platform-linux: sysctl: failed to set '/proc/sys/net/ipv6/conf/vlan20/disable_ipv6' to '1': (2) No such file or directory
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0454] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0461] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0464] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0469] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0470] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0472] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0473] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0474] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 systemd[1]: Starting GSSAPI Proxy Daemon...
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0488] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0496] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0498] policy: auto-activating connection 'ci-private-network' (f659475b-7c6f-5319-b371-519bc515c6f0)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0500] policy: auto-activating connection 'eth1-port' (19cf4192-6bcb-444b-ae5b-b65ed7eb80f5)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0501] policy: auto-activating connection 'vlan20-port' (250e055d-e2d6-47d8-a97e-ae3fcd2ad51e)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0502] policy: auto-activating connection 'br-ex-port' (8546a6a0-ff01-4e77-9a80-0ccb84b09e15)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0503] policy: auto-activating connection 'vlan21-port' (ae6c2a0f-6e02-482e-bde7-7c80cfee7790)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0503] policy: auto-activating connection 'br-ex-br' (e5fe6336-0393-4f39-89c9-10707afd900f)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0504] policy: auto-activating connection 'vlan22-port' (f99e4b34-6f01-46fc-9977-6ae5bcc3a7ce)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0507] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0513] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0515] device (eth1): Activation: starting connection 'ci-private-network' (f659475b-7c6f-5319-b371-519bc515c6f0)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0517] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (19cf4192-6bcb-444b-ae5b-b65ed7eb80f5)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0519] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (250e055d-e2d6-47d8-a97e-ae3fcd2ad51e)
Oct  8 18:48:59 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0532] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (8546a6a0-ff01-4e77-9a80-0ccb84b09e15)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0533] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (ae6c2a0f-6e02-482e-bde7-7c80cfee7790)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0538] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (e5fe6336-0393-4f39-89c9-10707afd900f)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0540] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (f99e4b34-6f01-46fc-9977-6ae5bcc3a7ce)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0543] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0545] manager: NetworkManager state is now CONNECTING
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0546] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0551] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0567] device (eth1): state change: prepare -> deactivating (reason 'new-activation', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0572] device (eth1): disconnecting for new activation request.
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0572] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0574] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0576] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0578] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0581] device (br-ex)[Open vSwitch Port]: state change: prepare -> deactivating (reason 'new-activation', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0585] device (br-ex)[Open vSwitch Port]: disconnecting for new activation request.
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0587] device (eth1)[Open vSwitch Port]: state change: prepare -> deactivating (reason 'new-activation', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0595] device (eth1)[Open vSwitch Port]: disconnecting for new activation request.
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0596] device (vlan20)[Open vSwitch Port]: state change: prepare -> deactivating (reason 'new-activation', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0602] device (vlan20)[Open vSwitch Port]: disconnecting for new activation request.
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0603] device (vlan21)[Open vSwitch Port]: state change: prepare -> deactivating (reason 'new-activation', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0609] device (vlan21)[Open vSwitch Port]: disconnecting for new activation request.
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0609] device (vlan22)[Open vSwitch Port]: state change: disconnected -> deactivating (reason 'new-activation', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0616] device (vlan22)[Open vSwitch Port]: disconnecting for new activation request.
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0617] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0619] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0621] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0625] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 kernel: vlan21: left promiscuous mode
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0630] dhcp4 (eth0): activation: beginning transaction (no timeout)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0656] device (eth1): disconnecting for new activation request.
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0660] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0665] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0666] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0713] device (eth1): Activation: starting connection 'ci-private-network' (f659475b-7c6f-5319-b371-519bc515c6f0)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0716] device (br-ex)[Open vSwitch Port]: state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0720] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (8546a6a0-ff01-4e77-9a80-0ccb84b09e15)
Oct  8 18:48:59 compute-0 kernel: vlan22: left promiscuous mode
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0755] dhcp4 (eth0): state changed new lease, address=38.102.83.120
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0763] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0803] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0813] policy: auto-activating connection 'vlan20-if' (7ec5abe9-78ff-4cf6-b429-5715d559f836)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0815] policy: auto-activating connection 'vlan21-if' (de041523-e9cd-491b-892a-fc68198a2acd)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0817] policy: auto-activating connection 'vlan22-if' (52d34180-b43f-4c96-a950-d37fc7c59cd0)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0818] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0825] device (lo): Activation: successful, device activated.
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0839] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 18:48:59 compute-0 systemd[1]: Started GSSAPI Proxy Daemon.
Oct  8 18:48:59 compute-0 kernel: virtio_net virtio5 eth1: left promiscuous mode
Oct  8 18:48:59 compute-0 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0859] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 systemd[1]: Reached target NFS client services.
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0864] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0867] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0869] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0870] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0871] device (eth1)[Open vSwitch Port]: state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Oct  8 18:48:59 compute-0 systemd[1]: Reached target Preparation for Remote File Systems.
Oct  8 18:48:59 compute-0 systemd[1]: Reached target Remote File Systems.
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0902] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (19cf4192-6bcb-444b-ae5b-b65ed7eb80f5)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0906] device (vlan20)[Open vSwitch Port]: state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0915] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (250e055d-e2d6-47d8-a97e-ae3fcd2ad51e)
Oct  8 18:48:59 compute-0 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct  8 18:48:59 compute-0 kernel: ovs-system: left promiscuous mode
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0920] device (vlan21)[Open vSwitch Port]: state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0929] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (ae6c2a0f-6e02-482e-bde7-7c80cfee7790)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0933] device (vlan22)[Open vSwitch Port]: state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0941] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (f99e4b34-6f01-46fc-9977-6ae5bcc3a7ce)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0944] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.0952] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1035] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1043] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1052] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (7ec5abe9-78ff-4cf6-b429-5715d559f836)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1054] policy: auto-activating connection 'vlan21-if' (de041523-e9cd-491b-892a-fc68198a2acd)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1060] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1073] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1080] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1086] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1089] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1092] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1099] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1103] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1107] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1110] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1116] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1120] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1122] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1125] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1132] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1136] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1139] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1141] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1149] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1162] policy: auto-activating connection 'vlan22-if' (52d34180-b43f-4c96-a950-d37fc7c59cd0)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1165] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1170] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1175] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1182] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1186] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1199] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1207] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1215] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (de041523-e9cd-491b-892a-fc68198a2acd)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1216] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 kernel: ovs-system: entered promiscuous mode
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1228] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1234] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1240] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1246] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 kernel: No such timeout policy "ovs_test_tp"
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1253] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1257] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1262] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (52d34180-b43f-4c96-a950-d37fc7c59cd0)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1262] policy: auto-activating connection 'br-ex-if' (794f8bbe-0f95-43e1-a59b-5efcb30fbf56)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1265] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1272] device (eth0): Activation: successful, device activated.
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1279] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1283] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1287] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1293] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1298] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1301] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1311] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1320] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 kernel: vlan20: entered promiscuous mode
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1348] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1351] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1362] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (794f8bbe-0f95-43e1-a59b-5efcb30fbf56)
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1362] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1368] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1376] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1379] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1380] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1385] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1391] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1394] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1396] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1403] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1407] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1411] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 kernel: vlan21: entered promiscuous mode
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1417] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1424] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1431] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1455] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1463] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Oct  8 18:48:59 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1472] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1477] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1483] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1501] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1504] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1509] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1530] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1543] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1554] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1562] device (eth1): Activation: successful, device activated.
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1573] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1577] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1583] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1592] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1608] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1645] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1647] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1653] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Oct  8 18:48:59 compute-0 kernel: br-ex: entered promiscuous mode
Oct  8 18:48:59 compute-0 kernel: vlan22: entered promiscuous mode
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1744] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1755] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1769] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1771] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1777] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1844] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1854] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1881] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1882] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1887] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Oct  8 18:48:59 compute-0 NetworkManager[1035]: <info>  [1759949339.1894] manager: startup complete
Oct  8 18:48:59 compute-0 systemd[1]: Finished Network Manager Wait Online.
Oct  8 18:48:59 compute-0 systemd[1]: Starting Cloud-init: Network Stage...
Oct  8 18:48:59 compute-0 systemd[1]: Starting Authorization Manager...
Oct  8 18:48:59 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Oct  8 18:48:59 compute-0 polkitd[1188]: Started polkitd version 0.117
Oct  8 18:48:59 compute-0 systemd[1]: Started Authorization Manager.
Oct  8 18:48:59 compute-0 cloud-init[1253]: Cloud-init v. 24.4-7.el9 running 'init' at Wed, 08 Oct 2025 18:48:59 +0000. Up 10.12 seconds.
Oct  8 18:48:59 compute-0 cloud-init[1253]: ci-info: +++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++
Oct  8 18:48:59 compute-0 cloud-init[1253]: ci-info: +------------+-------+-----------------+---------------+--------+-------------------+
Oct  8 18:48:59 compute-0 cloud-init[1253]: ci-info: |   Device   |   Up  |     Address     |      Mask     | Scope  |     Hw-Address    |
Oct  8 18:48:59 compute-0 cloud-init[1253]: ci-info: +------------+-------+-----------------+---------------+--------+-------------------+
Oct  8 18:48:59 compute-0 cloud-init[1253]: ci-info: |   br-ex    |  True | 192.168.122.100 | 255.255.255.0 | global | fa:16:3e:7a:c4:21 |
Oct  8 18:48:59 compute-0 cloud-init[1253]: ci-info: |    eth0    |  True |  38.102.83.120  | 255.255.255.0 | global | fa:16:3e:22:ef:71 |
Oct  8 18:48:59 compute-0 cloud-init[1253]: ci-info: |    eth1    |  True |        .        |       .       |   .    | fa:16:3e:7a:c4:21 |
Oct  8 18:48:59 compute-0 cloud-init[1253]: ci-info: |     lo     |  True |    127.0.0.1    |   255.0.0.0   |  host  |         .         |
Oct  8 18:48:59 compute-0 cloud-init[1253]: ci-info: |     lo     |  True |     ::1/128     |       .       |  host  |         .         |
Oct  8 18:48:59 compute-0 cloud-init[1253]: ci-info: | ovs-system | False |        .        |       .       |   .    | 96:5c:8d:a9:f1:5f |
Oct  8 18:48:59 compute-0 cloud-init[1253]: ci-info: |   vlan20   |  True |   172.17.0.100  | 255.255.255.0 | global | 0a:84:82:fa:ae:3b |
Oct  8 18:48:59 compute-0 cloud-init[1253]: ci-info: |   vlan21   |  True |   172.18.0.100  | 255.255.255.0 | global | 16:5f:9c:c1:6f:96 |
Oct  8 18:48:59 compute-0 cloud-init[1253]: ci-info: |   vlan22   |  True |   172.19.0.100  | 255.255.255.0 | global | 16:5b:f6:63:52:30 |
Oct  8 18:48:59 compute-0 cloud-init[1253]: ci-info: +------------+-------+-----------------+---------------+--------+-------------------+
Oct  8 18:48:59 compute-0 cloud-init[1253]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Oct  8 18:48:59 compute-0 cloud-init[1253]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct  8 18:48:59 compute-0 cloud-init[1253]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Oct  8 18:48:59 compute-0 cloud-init[1253]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct  8 18:48:59 compute-0 cloud-init[1253]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Oct  8 18:48:59 compute-0 cloud-init[1253]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Oct  8 18:48:59 compute-0 cloud-init[1253]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Oct  8 18:48:59 compute-0 cloud-init[1253]: ci-info: |   3   |    172.17.0.0   |    0.0.0.0    |  255.255.255.0  |   vlan20  |   U   |
Oct  8 18:48:59 compute-0 cloud-init[1253]: ci-info: |   4   |    172.18.0.0   |    0.0.0.0    |  255.255.255.0  |   vlan21  |   U   |
Oct  8 18:48:59 compute-0 cloud-init[1253]: ci-info: |   5   |    172.19.0.0   |    0.0.0.0    |  255.255.255.0  |   vlan22  |   U   |
Oct  8 18:48:59 compute-0 cloud-init[1253]: ci-info: |   6   |  192.168.122.0  |    0.0.0.0    |  255.255.255.0  |   br-ex   |   U   |
Oct  8 18:48:59 compute-0 cloud-init[1253]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct  8 18:48:59 compute-0 cloud-init[1253]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Oct  8 18:48:59 compute-0 cloud-init[1253]: ci-info: +-------+-------------+---------+-----------+-------+
Oct  8 18:48:59 compute-0 cloud-init[1253]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Oct  8 18:48:59 compute-0 cloud-init[1253]: ci-info: +-------+-------------+---------+-----------+-------+
Oct  8 18:48:59 compute-0 cloud-init[1253]: ci-info: |   2   |  multicast  |    ::   |    eth1   |   U   |
Oct  8 18:48:59 compute-0 cloud-init[1253]: ci-info: +-------+-------------+---------+-----------+-------+
Oct  8 18:48:59 compute-0 systemd[1]: Finished Cloud-init: Network Stage.
Oct  8 18:48:59 compute-0 systemd[1]: Reached target Cloud-config availability.
Oct  8 18:48:59 compute-0 systemd[1]: Reached target Network is Online.
Oct  8 18:49:00 compute-0 systemd[1]: Starting Cloud-init: Config Stage...
Oct  8 18:49:00 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Oct  8 18:49:00 compute-0 systemd[1]: Starting Notify NFS peers of a restart...
Oct  8 18:49:00 compute-0 systemd[1]: Starting System Logging Service...
Oct  8 18:49:00 compute-0 sm-notify[1287]: Version 2.5.4 starting
Oct  8 18:49:00 compute-0 systemd[1]: Starting OpenSSH server daemon...
Oct  8 18:49:00 compute-0 systemd[1]: Starting Permit User Sessions...
Oct  8 18:49:00 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Oct  8 18:49:00 compute-0 systemd[1]: Started Notify NFS peers of a restart.
Oct  8 18:49:00 compute-0 systemd[1]: Started OpenSSH server daemon.
Oct  8 18:49:00 compute-0 systemd[1]: Finished Permit User Sessions.
Oct  8 18:49:00 compute-0 systemd[1]: Started Command Scheduler.
Oct  8 18:49:00 compute-0 systemd[1]: Started Getty on tty1.
Oct  8 18:49:00 compute-0 systemd[1]: Started Serial Getty on ttyS0.
Oct  8 18:49:00 compute-0 systemd[1]: Reached target Login Prompts.
Oct  8 18:49:00 compute-0 rsyslogd[1288]: [origin software="rsyslogd" swVersion="8.2506.0-2.el9" x-pid="1288" x-info="https://www.rsyslog.com"] start
Oct  8 18:49:00 compute-0 systemd[1]: Started System Logging Service.
Oct  8 18:49:00 compute-0 systemd[1]: Reached target Multi-User System.
Oct  8 18:49:00 compute-0 systemd[1]: Starting Record Runlevel Change in UTMP...
Oct  8 18:49:00 compute-0 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Oct  8 18:49:00 compute-0 systemd[1]: Finished Record Runlevel Change in UTMP.
Oct  8 18:49:00 compute-0 rsyslogd[1288]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  8 18:49:00 compute-0 cloud-init[1300]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Wed, 08 Oct 2025 18:49:00 +0000. Up 10.94 seconds.
Oct  8 18:49:00 compute-0 systemd[1]: Finished Cloud-init: Config Stage.
Oct  8 18:49:00 compute-0 systemd[1]: Starting Cloud-init: Final Stage...
Oct  8 18:49:00 compute-0 cloud-init[1304]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Wed, 08 Oct 2025 18:49:00 +0000. Up 11.31 seconds.
Oct  8 18:49:00 compute-0 cloud-init[1304]: Cloud-init v. 24.4-7.el9 finished at Wed, 08 Oct 2025 18:49:00 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 11.36 seconds
Oct  8 18:49:00 compute-0 systemd[1]: Finished Cloud-init: Final Stage.
Oct  8 18:49:00 compute-0 systemd[1]: Reached target Cloud-init target.
Oct  8 18:49:00 compute-0 systemd[1]: Startup finished in 1.658s (kernel) + 3.210s (initrd) + 6.550s (userspace) = 11.419s.
Oct  8 18:49:06 compute-0 irqbalance[840]: Cannot change IRQ 25 affinity: Operation not permitted
Oct  8 18:49:06 compute-0 irqbalance[840]: IRQ 25 affinity is now unmanaged
Oct  8 18:49:06 compute-0 irqbalance[840]: Cannot change IRQ 31 affinity: Operation not permitted
Oct  8 18:49:06 compute-0 irqbalance[840]: IRQ 31 affinity is now unmanaged
Oct  8 18:49:06 compute-0 irqbalance[840]: Cannot change IRQ 28 affinity: Operation not permitted
Oct  8 18:49:06 compute-0 irqbalance[840]: IRQ 28 affinity is now unmanaged
Oct  8 18:49:06 compute-0 irqbalance[840]: Cannot change IRQ 26 affinity: Operation not permitted
Oct  8 18:49:06 compute-0 irqbalance[840]: IRQ 26 affinity is now unmanaged
Oct  8 18:49:06 compute-0 irqbalance[840]: Cannot change IRQ 32 affinity: Operation not permitted
Oct  8 18:49:06 compute-0 irqbalance[840]: IRQ 32 affinity is now unmanaged
Oct  8 18:49:06 compute-0 irqbalance[840]: Cannot change IRQ 30 affinity: Operation not permitted
Oct  8 18:49:06 compute-0 irqbalance[840]: IRQ 30 affinity is now unmanaged
Oct  8 18:49:06 compute-0 irqbalance[840]: Cannot change IRQ 29 affinity: Operation not permitted
Oct  8 18:49:06 compute-0 irqbalance[840]: IRQ 29 affinity is now unmanaged
Oct  8 18:49:06 compute-0 irqbalance[840]: Cannot change IRQ 27 affinity: Operation not permitted
Oct  8 18:49:06 compute-0 irqbalance[840]: IRQ 27 affinity is now unmanaged
Oct  8 18:49:09 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  8 18:49:29 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct  8 18:49:40 compute-0 systemd[1]: Created slice User Slice of UID 1000.
Oct  8 18:49:40 compute-0 systemd[1]: Starting User Runtime Directory /run/user/1000...
Oct  8 18:49:40 compute-0 systemd-logind[844]: New session 1 of user zuul.
Oct  8 18:49:40 compute-0 systemd[1]: Finished User Runtime Directory /run/user/1000.
Oct  8 18:49:40 compute-0 systemd[1]: Starting User Manager for UID 1000...
Oct  8 18:49:40 compute-0 systemd[1314]: Queued start job for default target Main User Target.
Oct  8 18:49:40 compute-0 systemd[1314]: Created slice User Application Slice.
Oct  8 18:49:40 compute-0 rsyslogd[1288]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  8 18:49:40 compute-0 systemd[1314]: Started Mark boot as successful after the user session has run 2 minutes.
Oct  8 18:49:40 compute-0 systemd[1314]: Started Daily Cleanup of User's Temporary Directories.
Oct  8 18:49:40 compute-0 systemd[1314]: Reached target Paths.
Oct  8 18:49:40 compute-0 systemd[1314]: Reached target Timers.
Oct  8 18:49:40 compute-0 systemd[1314]: Starting D-Bus User Message Bus Socket...
Oct  8 18:49:40 compute-0 systemd[1314]: Starting Create User's Volatile Files and Directories...
Oct  8 18:49:40 compute-0 systemd[1314]: Finished Create User's Volatile Files and Directories.
Oct  8 18:49:40 compute-0 systemd[1314]: Listening on D-Bus User Message Bus Socket.
Oct  8 18:49:40 compute-0 systemd[1314]: Reached target Sockets.
Oct  8 18:49:40 compute-0 systemd[1314]: Reached target Basic System.
Oct  8 18:49:40 compute-0 systemd[1314]: Reached target Main User Target.
Oct  8 18:49:40 compute-0 systemd[1314]: Startup finished in 184ms.
Oct  8 18:49:40 compute-0 systemd[1]: Started User Manager for UID 1000.
Oct  8 18:49:40 compute-0 systemd[1]: Started Session 1 of User zuul.
Oct  8 18:49:41 compute-0 python3.9[1539]: ansible-ansible.builtin.file Invoked with path=/var/lib/openstack/reboot_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:49:42 compute-0 systemd[1]: session-1.scope: Deactivated successfully.
Oct  8 18:49:42 compute-0 systemd-logind[844]: Session 1 logged out. Waiting for processes to exit.
Oct  8 18:49:42 compute-0 systemd-logind[844]: Removed session 1.
Oct  8 18:49:47 compute-0 systemd-logind[844]: New session 3 of user zuul.
Oct  8 18:49:47 compute-0 systemd[1]: Started Session 3 of User zuul.
Oct  8 18:49:48 compute-0 python3.9[1717]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 18:49:50 compute-0 python3.9[1873]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:49:50 compute-0 python3.9[2025]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:49:51 compute-0 python3.9[2177]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:49:52 compute-0 python3.9[2300]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949390.5945773-65-224496674355243/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=be98bc5d9b8806bf64c094b8d28e699f7676d168 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:49:53 compute-0 python3.9[2452]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:49:53 compute-0 python3.9[2575]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949392.3440707-65-113709593892587/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=597e518065fbb1bf8e8180d3d00442be9597b2b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:49:54 compute-0 python3.9[2727]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:49:55 compute-0 python3.9[2850]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949393.6744804-65-9693660716769/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=1281659b32c29a1dc7e0f0a384629b6f851ea22a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:49:56 compute-0 python3.9[3002]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:49:56 compute-0 python3.9[3154]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:49:57 compute-0 python3.9[3306]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:49:58 compute-0 python3.9[3429]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949396.6421862-124-137391660181529/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=7aeb83cb2a0f60617f03db5305a25fcf55b55f32 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:49:59 compute-0 python3.9[3581]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:49:59 compute-0 python3.9[3704]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949398.0560448-124-156558864807952/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=c26f6e5193be2786a8b8bb5189b3cb6e2b9477fa backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:50:00 compute-0 systemd[1]: Starting system activity accounting tool...
Oct  8 18:50:00 compute-0 systemd[1]: sysstat-collect.service: Deactivated successfully.
Oct  8 18:50:00 compute-0 systemd[1]: Finished system activity accounting tool.
Oct  8 18:50:00 compute-0 python3.9[3858]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:50:00 compute-0 python3.9[3981]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949399.3501544-124-146296131647806/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=bfbcf20cb82d842ee9779ffe8c4e17c06873b8ee backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:50:01 compute-0 python3.9[4133]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:50:02 compute-0 python3.9[4285]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:50:03 compute-0 python3.9[4437]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:50:03 compute-0 python3.9[4560]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949402.181615-183-38451611957925/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=02896da65fce18a168e7910f5af7526a79efb8fe backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:50:04 compute-0 python3.9[4712]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:50:05 compute-0 python3.9[4835]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949403.507737-183-53996794758040/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=dbbb1b6fdf947d6757e9a97b4db0f0fed02fc7bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:50:05 compute-0 python3.9[4987]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:50:06 compute-0 python3.9[5110]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949404.8134964-183-94886104920985/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=0a8ddb07cb526effa9222003433072d0e2469723 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:50:07 compute-0 python3.9[5262]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:50:08 compute-0 python3.9[5414]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:50:08 compute-0 python3.9[5566]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:50:09 compute-0 python3.9[5689]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949407.829225-242-126511778725729/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=e59bd502e03ac84f245d8f11272237c8e4c2985c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:50:10 compute-0 python3.9[5841]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:50:10 compute-0 python3.9[5964]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949409.1662827-242-75650193509197/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=dbbb1b6fdf947d6757e9a97b4db0f0fed02fc7bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:50:11 compute-0 python3.9[6116]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:50:12 compute-0 python3.9[6239]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949410.5229716-242-213434339985290/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=bdb025fff65dd78b3338c3645348e6625a6566d3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:50:13 compute-0 python3.9[6391]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:50:14 compute-0 python3.9[6543]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:50:14 compute-0 python3.9[6666]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949412.9994059-310-123677732071107/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=77fff1c6b7e11d6b8bf60629262eb6aa0aa1c835 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:50:15 compute-0 python3.9[6818]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:50:16 compute-0 python3.9[6970]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:50:16 compute-0 python3.9[7093]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949415.107704-334-53778534495422/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=77fff1c6b7e11d6b8bf60629262eb6aa0aa1c835 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:50:17 compute-0 python3.9[7245]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:50:18 compute-0 python3.9[7397]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:50:18 compute-0 python3.9[7520]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949417.1053324-358-47357061822012/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=77fff1c6b7e11d6b8bf60629262eb6aa0aa1c835 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:50:19 compute-0 python3.9[7672]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:50:20 compute-0 python3.9[7824]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:50:20 compute-0 python3.9[7947]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949419.272474-382-10723025164268/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=77fff1c6b7e11d6b8bf60629262eb6aa0aa1c835 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:50:21 compute-0 python3.9[8099]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:50:22 compute-0 python3.9[8251]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:50:23 compute-0 python3.9[8374]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949421.4911299-406-205720063418029/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=77fff1c6b7e11d6b8bf60629262eb6aa0aa1c835 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:50:23 compute-0 python3.9[8526]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:50:24 compute-0 python3.9[8678]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:50:25 compute-0 python3.9[8801]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949423.6835752-430-106338857731098/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=77fff1c6b7e11d6b8bf60629262eb6aa0aa1c835 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:50:26 compute-0 python3.9[8953]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:50:26 compute-0 python3.9[9105]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:50:27 compute-0 python3.9[9228]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949425.9541163-454-52092794050268/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=77fff1c6b7e11d6b8bf60629262eb6aa0aa1c835 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:50:28 compute-0 systemd[1]: session-3.scope: Deactivated successfully.
Oct  8 18:50:28 compute-0 systemd[1]: session-3.scope: Consumed 32.973s CPU time.
Oct  8 18:50:28 compute-0 systemd-logind[844]: Session 3 logged out. Waiting for processes to exit.
Oct  8 18:50:28 compute-0 systemd-logind[844]: Removed session 3.
Oct  8 18:50:33 compute-0 systemd-logind[844]: New session 4 of user zuul.
Oct  8 18:50:33 compute-0 systemd[1]: Started Session 4 of User zuul.
Oct  8 18:50:35 compute-0 python3.9[9406]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 18:50:36 compute-0 python3.9[9562]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:50:36 compute-0 python3.9[9714]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:50:37 compute-0 python3.9[9864]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 18:50:38 compute-0 python3.9[10016]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Oct  8 18:50:42 compute-0 dbus-broker-launch[836]: avc:  op=load_policy lsm=selinux seqno=2 res=1
Oct  8 18:50:42 compute-0 python3.9[10172]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  8 18:50:43 compute-0 python3.9[10256]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  8 18:50:47 compute-0 python3.9[10409]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  8 18:50:48 compute-0 python3[10564]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Oct  8 18:50:49 compute-0 python3.9[10716]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:50:50 compute-0 python3.9[10868]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:50:50 compute-0 python3.9[10946]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:50:51 compute-0 python3.9[11098]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:50:52 compute-0 python3.9[11176]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.3chpsua9 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:50:52 compute-0 python3.9[11328]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:50:53 compute-0 python3.9[11406]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:50:54 compute-0 python3.9[11558]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 18:50:54 compute-0 python3[11711]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct  8 18:50:55 compute-0 python3.9[11863]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:50:56 compute-0 python3.9[11988]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949454.6626768-157-214285420327743/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:50:57 compute-0 python3.9[12140]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:50:57 compute-0 python3.9[12265]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949456.2449355-172-183391960342928/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:50:58 compute-0 python3.9[12417]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:50:59 compute-0 python3.9[12542]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949457.5821939-187-181303124878137/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:50:59 compute-0 python3.9[12694]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:51:00 compute-0 python3.9[12819]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949458.9373817-202-41985063136530/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:51:01 compute-0 python3.9[12971]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:51:01 compute-0 python3.9[13096]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949460.2715518-217-258282885353684/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:51:02 compute-0 python3.9[13248]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:51:03 compute-0 python3.9[13400]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 18:51:04 compute-0 python3.9[13555]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:51:05 compute-0 python3.9[13707]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 18:51:05 compute-0 python3.9[13860]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 18:51:06 compute-0 python3.9[14014]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 18:51:07 compute-0 python3.9[14169]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:51:08 compute-0 python3.9[14319]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 18:51:09 compute-0 python3.9[14472]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:0e:0a:d8:76:c8:90" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 18:51:09 compute-0 ovs-vsctl[14473]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:0e:0a:d8:76:c8:90 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Oct  8 18:51:10 compute-0 python3.9[14625]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 18:51:11 compute-0 python3.9[14780]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 18:51:11 compute-0 ovs-vsctl[14781]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Oct  8 18:51:12 compute-0 python3.9[14931]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 18:51:12 compute-0 python3.9[15085]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:51:13 compute-0 python3.9[15237]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:51:14 compute-0 python3.9[15315]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:51:14 compute-0 python3.9[15467]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:51:15 compute-0 python3.9[15545]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:51:16 compute-0 python3.9[15697]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:51:16 compute-0 python3.9[15849]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:51:17 compute-0 python3.9[15927]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:51:18 compute-0 python3.9[16079]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:51:18 compute-0 python3.9[16157]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:51:18 compute-0 chronyd[850]: Selected source 162.159.200.1 (pool.ntp.org)
Oct  8 18:51:19 compute-0 python3.9[16309]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 18:51:19 compute-0 systemd[1]: Reloading.
Oct  8 18:51:19 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:51:19 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:51:20 compute-0 python3.9[16497]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:51:21 compute-0 python3.9[16575]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:51:21 compute-0 python3.9[16727]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:51:22 compute-0 python3.9[16805]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:51:23 compute-0 python3.9[16957]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 18:51:23 compute-0 systemd[1]: Reloading.
Oct  8 18:51:23 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:51:23 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:51:23 compute-0 systemd[1]: Starting Create netns directory...
Oct  8 18:51:23 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  8 18:51:23 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  8 18:51:23 compute-0 systemd[1]: Finished Create netns directory.
Oct  8 18:51:24 compute-0 python3.9[17150]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:51:25 compute-0 python3.9[17302]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:51:25 compute-0 python3.9[17425]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759949484.4718456-468-184137831269943/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:51:26 compute-0 python3.9[17577]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:51:27 compute-0 python3.9[17729]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:51:27 compute-0 python3.9[17852]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759949486.7217233-493-125235074874130/.source.json _original_basename=.80d_off9 follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:51:28 compute-0 python3.9[18004]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:51:30 compute-0 python3.9[18431]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Oct  8 18:51:31 compute-0 python3.9[18583]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  8 18:51:32 compute-0 python3.9[18735]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct  8 18:51:33 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat3189792475-merged.mount: Deactivated successfully.
Oct  8 18:51:33 compute-0 kernel: evm: overlay not supported
Oct  8 18:51:33 compute-0 podman[18736]: 2025-10-08 18:51:33.16962068 +0000 UTC m=+0.204498082 system refresh
Oct  8 18:51:34 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 18:51:34 compute-0 python3[18904]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct  8 18:51:34 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 18:51:34 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  8 18:51:34 compute-0 podman[18940]: 2025-10-08 18:51:34.674681157 +0000 UTC m=+0.026348267 image pull 70c92fb64e1eda6ef063d34e60e9a541e44edbaa51e757e8304331202c76a3a7 quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857
Oct  8 18:51:34 compute-0 podman[18940]: 2025-10-08 18:51:34.817700723 +0000 UTC m=+0.169367783 container create 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Oct  8 18:51:34 compute-0 python3[18904]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857
Oct  8 18:51:35 compute-0 python3.9[19134]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 18:51:36 compute-0 python3.9[19288]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:51:37 compute-0 python3.9[19364]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 18:51:37 compute-0 python3.9[19515]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759949497.1920602-581-162758505078983/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:51:38 compute-0 python3.9[19591]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  8 18:51:38 compute-0 systemd[1]: Reloading.
Oct  8 18:51:38 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:51:38 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:51:39 compute-0 python3.9[19702]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 18:51:39 compute-0 systemd[1]: Reloading.
Oct  8 18:51:39 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:51:39 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:51:39 compute-0 systemd[1]: Starting ovn_controller container...
Oct  8 18:51:40 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Oct  8 18:51:40 compute-0 systemd[1]: Started libcrun container.
Oct  8 18:51:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8f3dedad02a36764af72efb60d6a2f065ff7fe559c6f3cfd8cec593e78849fb5/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct  8 18:51:40 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59.
Oct  8 18:51:40 compute-0 podman[19743]: 2025-10-08 18:51:40.374599402 +0000 UTC m=+0.398483181 container init 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 18:51:40 compute-0 podman[19743]: 2025-10-08 18:51:40.414980941 +0000 UTC m=+0.438864720 container start 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 18:51:40 compute-0 edpm-start-podman-container[19743]: ovn_controller
Oct  8 18:51:40 compute-0 ovn_controller[19759]: + sudo -E kolla_set_configs
Oct  8 18:51:40 compute-0 edpm-start-podman-container[19742]: Creating additional drop-in dependency for "ovn_controller" (4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59)
Oct  8 18:51:40 compute-0 systemd[1]: Reloading.
Oct  8 18:51:40 compute-0 podman[19764]: 2025-10-08 18:51:40.577116265 +0000 UTC m=+0.151030146 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Oct  8 18:51:40 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:51:40 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:51:40 compute-0 systemd[1]: 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59-1f119a9749d9ea73.service: Main process exited, code=exited, status=1/FAILURE
Oct  8 18:51:40 compute-0 systemd[1]: 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59-1f119a9749d9ea73.service: Failed with result 'exit-code'.
Oct  8 18:51:40 compute-0 systemd[1]: Started ovn_controller container.
Oct  8 18:51:40 compute-0 systemd[1]: Created slice User Slice of UID 0.
Oct  8 18:51:40 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Oct  8 18:51:40 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Oct  8 18:51:40 compute-0 systemd[1]: Starting User Manager for UID 0...
Oct  8 18:51:40 compute-0 systemd[1314]: Starting Mark boot as successful...
Oct  8 18:51:40 compute-0 systemd[1314]: Finished Mark boot as successful.
Oct  8 18:51:40 compute-0 systemd[19844]: Queued start job for default target Main User Target.
Oct  8 18:51:40 compute-0 systemd[19844]: Created slice User Application Slice.
Oct  8 18:51:40 compute-0 systemd[19844]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Oct  8 18:51:40 compute-0 systemd[19844]: Started Daily Cleanup of User's Temporary Directories.
Oct  8 18:51:40 compute-0 systemd[19844]: Reached target Paths.
Oct  8 18:51:40 compute-0 systemd[19844]: Reached target Timers.
Oct  8 18:51:40 compute-0 systemd[19844]: Starting D-Bus User Message Bus Socket...
Oct  8 18:51:40 compute-0 systemd[19844]: Starting Create User's Volatile Files and Directories...
Oct  8 18:51:40 compute-0 systemd[19844]: Finished Create User's Volatile Files and Directories.
Oct  8 18:51:40 compute-0 systemd[19844]: Listening on D-Bus User Message Bus Socket.
Oct  8 18:51:40 compute-0 systemd[19844]: Reached target Sockets.
Oct  8 18:51:40 compute-0 systemd[19844]: Reached target Basic System.
Oct  8 18:51:40 compute-0 systemd[19844]: Reached target Main User Target.
Oct  8 18:51:40 compute-0 systemd[19844]: Startup finished in 99ms.
Oct  8 18:51:40 compute-0 systemd[1]: Started User Manager for UID 0.
Oct  8 18:51:40 compute-0 systemd[1]: Started Session c1 of User root.
Oct  8 18:51:41 compute-0 ovn_controller[19759]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  8 18:51:41 compute-0 ovn_controller[19759]: INFO:__main__:Validating config file
Oct  8 18:51:41 compute-0 ovn_controller[19759]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  8 18:51:41 compute-0 ovn_controller[19759]: INFO:__main__:Writing out command to execute
Oct  8 18:51:41 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Oct  8 18:51:41 compute-0 ovn_controller[19759]: ++ cat /run_command
Oct  8 18:51:41 compute-0 ovn_controller[19759]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Oct  8 18:51:41 compute-0 ovn_controller[19759]: + ARGS=
Oct  8 18:51:41 compute-0 ovn_controller[19759]: + sudo kolla_copy_cacerts
Oct  8 18:51:41 compute-0 systemd[1]: Started Session c2 of User root.
Oct  8 18:51:41 compute-0 ovn_controller[19759]: + [[ ! -n '' ]]
Oct  8 18:51:41 compute-0 ovn_controller[19759]: + . kolla_extend_start
Oct  8 18:51:41 compute-0 ovn_controller[19759]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Oct  8 18:51:41 compute-0 ovn_controller[19759]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Oct  8 18:51:41 compute-0 ovn_controller[19759]: + umask 0022
Oct  8 18:51:41 compute-0 ovn_controller[19759]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Oct  8 18:51:41 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Oct  8 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Oct  8 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Oct  8 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Oct  8 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Oct  8 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Oct  8 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Oct  8 18:51:41 compute-0 NetworkManager[1035]: <info>  [1759949501.3924] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Oct  8 18:51:41 compute-0 NetworkManager[1035]: <info>  [1759949501.3933] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Oct  8 18:51:41 compute-0 NetworkManager[1035]: <info>  [1759949501.3949] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Oct  8 18:51:41 compute-0 kernel: br-int: entered promiscuous mode
Oct  8 18:51:41 compute-0 NetworkManager[1035]: <info>  [1759949501.3958] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Oct  8 18:51:41 compute-0 NetworkManager[1035]: <info>  [1759949501.3973] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct  8 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct  8 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  8 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Oct  8 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Oct  8 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Oct  8 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Oct  8 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00014|main|INFO|OVS feature set changed, force recompute.
Oct  8 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct  8 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  8 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct  8 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00018|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Oct  8 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00019|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct  8 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00020|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Oct  8 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00021|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Oct  8 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00022|main|INFO|OVS feature set changed, force recompute.
Oct  8 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Oct  8 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Oct  8 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct  8 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct  8 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  8 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  8 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct  8 18:51:41 compute-0 ovn_controller[19759]: 2025-10-08T18:51:41Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct  8 18:51:41 compute-0 NetworkManager[1035]: <info>  [1759949501.4166] manager: (ovn-98a9aa-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Oct  8 18:51:41 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Oct  8 18:51:41 compute-0 NetworkManager[1035]: <info>  [1759949501.4395] device (genev_sys_6081): carrier: link connected
Oct  8 18:51:41 compute-0 NetworkManager[1035]: <info>  [1759949501.4398] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Oct  8 18:51:41 compute-0 systemd-udevd[20027]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 18:51:41 compute-0 systemd-udevd[20030]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 18:51:41 compute-0 python3.9[20016]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 18:51:41 compute-0 ovs-vsctl[20033]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Oct  8 18:51:42 compute-0 python3.9[20185]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 18:51:42 compute-0 ovs-vsctl[20187]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Oct  8 18:51:43 compute-0 python3.9[20340]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 18:51:43 compute-0 ovs-vsctl[20341]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Oct  8 18:51:43 compute-0 systemd[1]: session-4.scope: Deactivated successfully.
Oct  8 18:51:43 compute-0 systemd[1]: session-4.scope: Consumed 50.988s CPU time.
Oct  8 18:51:43 compute-0 systemd-logind[844]: Session 4 logged out. Waiting for processes to exit.
Oct  8 18:51:43 compute-0 systemd-logind[844]: Removed session 4.
Oct  8 18:51:49 compute-0 systemd-logind[844]: New session 6 of user zuul.
Oct  8 18:51:49 compute-0 systemd[1]: Started Session 6 of User zuul.
Oct  8 18:51:50 compute-0 python3.9[20519]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 18:51:51 compute-0 systemd[1]: Stopping User Manager for UID 0...
Oct  8 18:51:51 compute-0 systemd[19844]: Activating special unit Exit the Session...
Oct  8 18:51:51 compute-0 systemd[19844]: Stopped target Main User Target.
Oct  8 18:51:51 compute-0 systemd[19844]: Stopped target Basic System.
Oct  8 18:51:51 compute-0 systemd[19844]: Stopped target Paths.
Oct  8 18:51:51 compute-0 systemd[19844]: Stopped target Sockets.
Oct  8 18:51:51 compute-0 systemd[19844]: Stopped target Timers.
Oct  8 18:51:51 compute-0 systemd[19844]: Stopped Daily Cleanup of User's Temporary Directories.
Oct  8 18:51:51 compute-0 systemd[19844]: Closed D-Bus User Message Bus Socket.
Oct  8 18:51:51 compute-0 systemd[19844]: Stopped Create User's Volatile Files and Directories.
Oct  8 18:51:51 compute-0 systemd[19844]: Removed slice User Application Slice.
Oct  8 18:51:51 compute-0 systemd[19844]: Reached target Shutdown.
Oct  8 18:51:51 compute-0 systemd[19844]: Finished Exit the Session.
Oct  8 18:51:51 compute-0 systemd[19844]: Reached target Exit the Session.
Oct  8 18:51:51 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Oct  8 18:51:51 compute-0 systemd[1]: Stopped User Manager for UID 0.
Oct  8 18:51:51 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct  8 18:51:51 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Oct  8 18:51:51 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct  8 18:51:51 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct  8 18:51:51 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Oct  8 18:51:51 compute-0 python3.9[20677]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:51:52 compute-0 python3.9[20829]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:51:53 compute-0 python3.9[20981]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:51:54 compute-0 python3.9[21133]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:51:54 compute-0 python3.9[21285]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:51:55 compute-0 python3.9[21435]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 18:51:56 compute-0 python3.9[21587]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Oct  8 18:51:58 compute-0 python3.9[21737]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:51:59 compute-0 python3.9[21858]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759949517.6913862-86-53439925079176/.source follow=False _original_basename=haproxy.j2 checksum=4bca74f6ee0b6450624d22997e2f90c414d58b44 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:51:59 compute-0 python3.9[22009]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:52:00 compute-0 python3.9[22130]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759949519.285496-101-219849640860557/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:52:01 compute-0 python3.9[22282]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  8 18:52:02 compute-0 python3.9[22366]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  8 18:52:04 compute-0 python3.9[22519]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  8 18:52:05 compute-0 python3.9[22672]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:52:06 compute-0 python3.9[22793]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759949524.8942306-138-171508789279123/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:52:06 compute-0 python3.9[22943]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:52:07 compute-0 python3.9[23064]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759949526.2239397-138-260886726275346/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:52:08 compute-0 python3.9[23214]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:52:09 compute-0 python3.9[23335]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759949528.1397173-182-155786115814026/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:52:10 compute-0 python3.9[23485]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:52:10 compute-0 python3.9[23606]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759949529.519808-182-110034891505854/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:52:11 compute-0 ovn_controller[19759]: 2025-10-08T18:52:11Z|00025|memory|INFO|16128 kB peak resident set size after 29.9 seconds
Oct  8 18:52:11 compute-0 ovn_controller[19759]: 2025-10-08T18:52:11Z|00026|memory|INFO|idl-cells-OVN_Southbound:239 idl-cells-Open_vSwitch:471 ofctrl_desired_flow_usage-KB:5 ofctrl_installed_flow_usage-KB:4 ofctrl_sb_flow_ref_usage-KB:2
Oct  8 18:52:11 compute-0 podman[23730]: 2025-10-08 18:52:11.325229137 +0000 UTC m=+0.106607422 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  8 18:52:11 compute-0 python3.9[23769]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 18:52:12 compute-0 python3.9[23936]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:52:13 compute-0 python3.9[24088]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:52:13 compute-0 python3.9[24166]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:52:14 compute-0 python3.9[24318]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:52:14 compute-0 python3.9[24396]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:52:15 compute-0 python3.9[24548]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:52:16 compute-0 python3.9[24700]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:52:17 compute-0 python3.9[24780]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:52:17 compute-0 python3.9[24933]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:52:18 compute-0 python3.9[25011]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:52:19 compute-0 python3.9[25163]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 18:52:19 compute-0 systemd[1]: Reloading.
Oct  8 18:52:19 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:52:19 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:52:20 compute-0 python3.9[25354]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:52:20 compute-0 python3.9[25432]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:52:21 compute-0 python3.9[25584]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:52:21 compute-0 python3.9[25662]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:52:22 compute-0 python3.9[25815]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 18:52:22 compute-0 systemd[1]: Reloading.
Oct  8 18:52:23 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:52:23 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:52:23 compute-0 systemd[1]: Starting Create netns directory...
Oct  8 18:52:23 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  8 18:52:23 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  8 18:52:23 compute-0 systemd[1]: Finished Create netns directory.
Oct  8 18:52:24 compute-0 python3.9[26008]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:52:25 compute-0 python3.9[26160]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:52:25 compute-0 python3.9[26285]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759949544.3555024-333-262336270419425/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:52:26 compute-0 python3.9[26438]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:52:27 compute-0 python3.9[26590]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:52:27 compute-0 python3.9[26713]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759949546.7347236-358-278074340770702/.source.json _original_basename=.vi1hzfr1 follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:52:28 compute-0 python3.9[26865]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:52:31 compute-0 python3.9[27294]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Oct  8 18:52:32 compute-0 python3.9[27446]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  8 18:52:33 compute-0 python3.9[27598]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct  8 18:52:34 compute-0 python3[27778]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct  8 18:52:34 compute-0 podman[27817]: 2025-10-08 18:52:34.899525688 +0000 UTC m=+0.038155287 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct  8 18:52:35 compute-0 podman[27817]: 2025-10-08 18:52:35.168556674 +0000 UTC m=+0.307186203 container create 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Oct  8 18:52:35 compute-0 python3[27778]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct  8 18:52:36 compute-0 python3.9[28009]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 18:52:36 compute-0 python3.9[28163]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:52:37 compute-0 python3.9[28240]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 18:52:38 compute-0 python3.9[28391]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759949557.613259-446-219813364589479/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:52:39 compute-0 python3.9[28467]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  8 18:52:39 compute-0 systemd[1]: Reloading.
Oct  8 18:52:39 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:52:39 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:52:40 compute-0 python3.9[28579]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 18:52:40 compute-0 systemd[1]: Reloading.
Oct  8 18:52:40 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:52:40 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:52:40 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Oct  8 18:52:40 compute-0 systemd[1]: Started libcrun container.
Oct  8 18:52:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62ed697a92c2e27de8b9b8411fab5fe0db1b62146968c509eda8ca855c6aea8b/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Oct  8 18:52:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62ed697a92c2e27de8b9b8411fab5fe0db1b62146968c509eda8ca855c6aea8b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 18:52:40 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a.
Oct  8 18:52:40 compute-0 podman[28621]: 2025-10-08 18:52:40.626118484 +0000 UTC m=+0.268880153 container init 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct  8 18:52:40 compute-0 ovn_metadata_agent[28637]: + sudo -E kolla_set_configs
Oct  8 18:52:40 compute-0 podman[28621]: 2025-10-08 18:52:40.658825273 +0000 UTC m=+0.301586852 container start 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct  8 18:52:40 compute-0 edpm-start-podman-container[28621]: ovn_metadata_agent
Oct  8 18:52:40 compute-0 podman[28642]: 2025-10-08 18:52:40.737692508 +0000 UTC m=+0.064666118 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  8 18:52:40 compute-0 edpm-start-podman-container[28620]: Creating additional drop-in dependency for "ovn_metadata_agent" (80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a)
Oct  8 18:52:40 compute-0 ovn_metadata_agent[28637]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  8 18:52:40 compute-0 ovn_metadata_agent[28637]: INFO:__main__:Validating config file
Oct  8 18:52:40 compute-0 ovn_metadata_agent[28637]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  8 18:52:40 compute-0 ovn_metadata_agent[28637]: INFO:__main__:Copying service configuration files
Oct  8 18:52:40 compute-0 ovn_metadata_agent[28637]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Oct  8 18:52:40 compute-0 ovn_metadata_agent[28637]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Oct  8 18:52:40 compute-0 ovn_metadata_agent[28637]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Oct  8 18:52:40 compute-0 ovn_metadata_agent[28637]: INFO:__main__:Writing out command to execute
Oct  8 18:52:40 compute-0 ovn_metadata_agent[28637]: INFO:__main__:Setting permission for /var/lib/neutron
Oct  8 18:52:40 compute-0 ovn_metadata_agent[28637]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Oct  8 18:52:40 compute-0 ovn_metadata_agent[28637]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Oct  8 18:52:40 compute-0 ovn_metadata_agent[28637]: INFO:__main__:Setting permission for /var/lib/neutron/external
Oct  8 18:52:40 compute-0 ovn_metadata_agent[28637]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Oct  8 18:52:40 compute-0 ovn_metadata_agent[28637]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Oct  8 18:52:40 compute-0 ovn_metadata_agent[28637]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Oct  8 18:52:40 compute-0 ovn_metadata_agent[28637]: ++ cat /run_command
Oct  8 18:52:40 compute-0 ovn_metadata_agent[28637]: + CMD=neutron-ovn-metadata-agent
Oct  8 18:52:40 compute-0 ovn_metadata_agent[28637]: + ARGS=
Oct  8 18:52:40 compute-0 ovn_metadata_agent[28637]: + sudo kolla_copy_cacerts
Oct  8 18:52:40 compute-0 systemd[1]: Reloading.
Oct  8 18:52:40 compute-0 ovn_metadata_agent[28637]: + [[ ! -n '' ]]
Oct  8 18:52:40 compute-0 ovn_metadata_agent[28637]: + . kolla_extend_start
Oct  8 18:52:40 compute-0 ovn_metadata_agent[28637]: Running command: 'neutron-ovn-metadata-agent'
Oct  8 18:52:40 compute-0 ovn_metadata_agent[28637]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Oct  8 18:52:40 compute-0 ovn_metadata_agent[28637]: + umask 0022
Oct  8 18:52:40 compute-0 ovn_metadata_agent[28637]: + exec neutron-ovn-metadata-agent
Oct  8 18:52:40 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:52:40 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:52:41 compute-0 systemd[1]: Started ovn_metadata_agent container.
Oct  8 18:52:41 compute-0 systemd-logind[844]: Session 6 logged out. Waiting for processes to exit.
Oct  8 18:52:41 compute-0 systemd[1]: session-6.scope: Deactivated successfully.
Oct  8 18:52:41 compute-0 systemd[1]: session-6.scope: Consumed 40.272s CPU time.
Oct  8 18:52:41 compute-0 systemd-logind[844]: Removed session 6.
Oct  8 18:52:41 compute-0 podman[28749]: 2025-10-08 18:52:41.603246714 +0000 UTC m=+0.142465711 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.165 28643 INFO neutron.common.config [-] Logging enabled!#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.166 28643 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.166 28643 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.166 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.166 28643 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.166 28643 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.167 28643 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.167 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.167 28643 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.167 28643 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.167 28643 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.167 28643 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.167 28643 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.167 28643 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.167 28643 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.167 28643 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.168 28643 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.168 28643 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.168 28643 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.168 28643 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.168 28643 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.168 28643 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.168 28643 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.168 28643 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.168 28643 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.168 28643 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.169 28643 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.169 28643 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.169 28643 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.169 28643 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.169 28643 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.169 28643 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.169 28643 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.169 28643 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.169 28643 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.170 28643 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.170 28643 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.170 28643 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.170 28643 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.170 28643 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.170 28643 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.170 28643 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.170 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.171 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.171 28643 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.171 28643 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.171 28643 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.171 28643 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.171 28643 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.171 28643 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.171 28643 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.171 28643 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.171 28643 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.171 28643 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.171 28643 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.172 28643 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.172 28643 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.172 28643 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.172 28643 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.172 28643 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.172 28643 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.172 28643 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.172 28643 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.172 28643 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.173 28643 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.173 28643 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.173 28643 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.173 28643 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.173 28643 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.173 28643 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.173 28643 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.173 28643 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.173 28643 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.174 28643 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.174 28643 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.174 28643 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.174 28643 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.174 28643 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.174 28643 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.174 28643 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.174 28643 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.174 28643 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.175 28643 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.175 28643 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.175 28643 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.175 28643 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.175 28643 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.175 28643 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.175 28643 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.175 28643 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.175 28643 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.176 28643 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.176 28643 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.176 28643 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.176 28643 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.176 28643 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.176 28643 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.176 28643 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.176 28643 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.176 28643 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.176 28643 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.177 28643 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.177 28643 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.177 28643 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.177 28643 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.177 28643 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.177 28643 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.177 28643 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.177 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.177 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.178 28643 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.178 28643 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.178 28643 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.178 28643 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.178 28643 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.178 28643 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.178 28643 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.178 28643 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.178 28643 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.178 28643 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.179 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.179 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.179 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.179 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.179 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.179 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.179 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.179 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.179 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.180 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.180 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.180 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.180 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.180 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.180 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.180 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.180 28643 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.181 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.181 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.181 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.181 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.181 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.181 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.181 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.181 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.181 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.181 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.182 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.182 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.182 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.182 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.182 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.182 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.182 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.182 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.182 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.182 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.183 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.183 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.183 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.183 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.183 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.183 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.183 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.183 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.183 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.183 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.184 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.184 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.184 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.184 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.184 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.184 28643 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.184 28643 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.184 28643 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.184 28643 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.185 28643 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.185 28643 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.185 28643 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.185 28643 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.185 28643 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.185 28643 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.185 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.185 28643 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.185 28643 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.186 28643 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.186 28643 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.186 28643 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.186 28643 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.186 28643 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.186 28643 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.186 28643 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.186 28643 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.187 28643 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.187 28643 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.187 28643 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.187 28643 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.187 28643 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.187 28643 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.187 28643 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.187 28643 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.187 28643 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.187 28643 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.188 28643 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.188 28643 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.188 28643 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.188 28643 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.188 28643 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.188 28643 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.188 28643 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.189 28643 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.189 28643 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.189 28643 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.189 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.189 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.189 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.189 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.189 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.189 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.190 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.190 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.190 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.190 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.190 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.190 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.190 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.190 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.190 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.191 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.191 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.191 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.191 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.191 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.191 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.191 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.191 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.191 28643 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.192 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.192 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.192 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.192 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.192 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.192 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.192 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.192 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.192 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.193 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.193 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.193 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.193 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.193 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.193 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.193 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.193 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.193 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.193 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.194 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.194 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.194 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.194 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.194 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.194 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.194 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.194 28643 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.194 28643 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.195 28643 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.195 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.195 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.195 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.195 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.195 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.195 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.195 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.196 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.196 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.196 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.196 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.196 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.196 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.196 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.196 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.196 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.196 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.197 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.197 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.197 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.197 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.197 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.197 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.197 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.197 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.197 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.197 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.198 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.198 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.198 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.198 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.198 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.198 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.198 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.198 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.198 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.198 28643 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.199 28643 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.207 28643 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.208 28643 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.208 28643 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.208 28643 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.208 28643 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.219 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 47f81f7a-64d8-418a-a74c-b879bd6deb83 (UUID: 47f81f7a-64d8-418a-a74c-b879bd6deb83) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.243 28643 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.243 28643 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.243 28643 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.243 28643 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.246 28643 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.252 28643 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.256 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '47f81f7a-64d8-418a-a74c-b879bd6deb83'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], external_ids={}, name=47f81f7a-64d8-418a-a74c-b879bd6deb83, nb_cfg_timestamp=1759949509419, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.257 28643 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f9cb27900a0>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.258 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.258 28643 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.258 28643 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.258 28643 INFO oslo_service.service [-] Starting 1 workers
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.262 28643 DEBUG oslo_service.service [-] Started child 28778 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.266 28643 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpk62ej7qg/privsep.sock']
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.267 28778 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-891676'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.303 28778 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.304 28778 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.304 28778 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.309 28778 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.317 28778 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.326 28778 INFO eventlet.wsgi.server [-] (28778) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Oct  8 18:52:44 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.992 28643 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.993 28643 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpk62ej7qg/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.876 28783 INFO oslo.privsep.daemon [-] privsep daemon starting
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.884 28783 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.887 28783 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.888 28783 INFO oslo.privsep.daemon [-] privsep daemon running as pid 28783
Oct  8 18:52:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:44.998 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[99dc8d27-7c6b-487f-915a-3e2d20899944]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.431 28783 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.431 28783 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.431 28783 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.898 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[3a3b2da1-2766-4d54-b42f-866a4d5ee3ca]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.901 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=47f81f7a-64d8-418a-a74c-b879bd6deb83, column=external_ids, values=({'neutron:ovn-metadata-id': '848359ed-b94c-5960-a0fa-54c8b235d5a5'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.911 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f81f7a-64d8-418a-a74c-b879bd6deb83, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.918 28643 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.919 28643 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.919 28643 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.919 28643 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.919 28643 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.919 28643 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.920 28643 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.920 28643 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.920 28643 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.920 28643 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.921 28643 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.921 28643 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.921 28643 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.921 28643 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.922 28643 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.922 28643 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.922 28643 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.923 28643 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.923 28643 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.923 28643 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.923 28643 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.923 28643 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.924 28643 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.924 28643 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.924 28643 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.925 28643 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.925 28643 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.925 28643 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.925 28643 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.926 28643 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.926 28643 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.926 28643 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.926 28643 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.927 28643 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.927 28643 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.927 28643 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.928 28643 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.928 28643 DEBUG oslo_service.service [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.928 28643 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.929 28643 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.929 28643 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.929 28643 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.929 28643 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.930 28643 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.930 28643 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.930 28643 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.930 28643 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.931 28643 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.931 28643 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.931 28643 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.932 28643 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.932 28643 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.932 28643 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.932 28643 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.933 28643 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.933 28643 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.933 28643 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.934 28643 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.934 28643 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.934 28643 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.934 28643 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.935 28643 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.935 28643 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.935 28643 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.935 28643 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.935 28643 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.936 28643 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.936 28643 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.936 28643 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.936 28643 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.937 28643 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.937 28643 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.937 28643 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.937 28643 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.938 28643 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.938 28643 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.938 28643 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.938 28643 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.939 28643 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.939 28643 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.939 28643 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.939 28643 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.940 28643 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.940 28643 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.940 28643 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.940 28643 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.941 28643 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.941 28643 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.941 28643 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.941 28643 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.942 28643 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.942 28643 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.942 28643 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.942 28643 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.942 28643 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.943 28643 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.943 28643 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.943 28643 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.943 28643 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.944 28643 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.944 28643 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.944 28643 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.944 28643 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.944 28643 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.945 28643 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.945 28643 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.945 28643 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.945 28643 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.946 28643 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.946 28643 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.946 28643 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.947 28643 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.947 28643 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.947 28643 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.947 28643 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.947 28643 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.948 28643 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.948 28643 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.948 28643 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.949 28643 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.949 28643 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.949 28643 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.949 28643 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.950 28643 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.950 28643 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.950 28643 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.951 28643 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.951 28643 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.951 28643 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.951 28643 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.951 28643 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.952 28643 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.952 28643 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.952 28643 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.952 28643 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.953 28643 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.953 28643 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.953 28643 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.953 28643 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.954 28643 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.954 28643 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.954 28643 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.954 28643 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.955 28643 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.955 28643 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.955 28643 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.955 28643 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.955 28643 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.956 28643 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.956 28643 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.956 28643 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.956 28643 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.957 28643 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.957 28643 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.957 28643 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.957 28643 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.957 28643 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.958 28643 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.958 28643 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.958 28643 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.958 28643 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.958 28643 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.959 28643 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.959 28643 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.959 28643 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.959 28643 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.960 28643 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.960 28643 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.960 28643 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.960 28643 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.960 28643 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.961 28643 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.961 28643 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.961 28643 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.961 28643 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.962 28643 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.962 28643 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.962 28643 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.962 28643 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.962 28643 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.963 28643 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.963 28643 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.963 28643 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.963 28643 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.963 28643 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.963 28643 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.963 28643 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.964 28643 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.964 28643 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.964 28643 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.964 28643 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.964 28643 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.964 28643 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.964 28643 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.965 28643 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.965 28643 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.965 28643 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.965 28643 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.965 28643 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.965 28643 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.965 28643 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.966 28643 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.966 28643 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.966 28643 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.966 28643 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.966 28643 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.966 28643 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.966 28643 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.967 28643 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.967 28643 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.967 28643 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.967 28643 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.967 28643 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.967 28643 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.967 28643 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.967 28643 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.968 28643 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.968 28643 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.968 28643 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.968 28643 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.968 28643 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.968 28643 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.968 28643 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.969 28643 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.969 28643 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.969 28643 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.969 28643 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.969 28643 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.969 28643 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.969 28643 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.970 28643 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.970 28643 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.970 28643 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.970 28643 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.970 28643 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.970 28643 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.970 28643 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.971 28643 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.971 28643 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.971 28643 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.971 28643 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.971 28643 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.971 28643 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.971 28643 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.972 28643 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.972 28643 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.972 28643 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.972 28643 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.972 28643 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.972 28643 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.972 28643 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.973 28643 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.973 28643 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.973 28643 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.973 28643 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.973 28643 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.973 28643 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.974 28643 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.974 28643 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.974 28643 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.974 28643 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.974 28643 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.974 28643 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.974 28643 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.974 28643 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.975 28643 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.975 28643 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.975 28643 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.975 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.975 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.975 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.976 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.976 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.976 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.976 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.976 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.976 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.976 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.977 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.977 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.977 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.977 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.977 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.977 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.977 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.978 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.978 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.978 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.978 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.978 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.978 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.979 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.979 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.979 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.979 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.979 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.979 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.979 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.980 28643 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.980 28643 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.980 28643 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.980 28643 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.980 28643 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 18:52:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:52:45.980 28643 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct  8 18:52:46 compute-0 systemd-logind[844]: New session 7 of user zuul.
Oct  8 18:52:46 compute-0 systemd[1]: Started Session 7 of User zuul.
Oct  8 18:52:47 compute-0 python3.9[28941]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 18:52:49 compute-0 python3.9[29097]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 18:52:50 compute-0 python3.9[29262]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  8 18:52:50 compute-0 systemd[1]: Reloading.
Oct  8 18:52:50 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:52:50 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:52:51 compute-0 python3.9[29447]: ansible-ansible.builtin.service_facts Invoked
Oct  8 18:52:51 compute-0 network[29464]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  8 18:52:51 compute-0 network[29465]: 'network-scripts' will be removed from distribution in near future.
Oct  8 18:52:51 compute-0 network[29466]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  8 18:52:56 compute-0 python3.9[29730]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 18:52:57 compute-0 python3.9[29883]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 18:52:58 compute-0 python3.9[30036]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 18:52:58 compute-0 python3.9[30189]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 18:53:00 compute-0 python3.9[30342]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 18:53:01 compute-0 python3.9[30495]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 18:53:02 compute-0 python3.9[30648]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 18:53:03 compute-0 python3.9[30801]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:53:04 compute-0 python3.9[30953]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:53:05 compute-0 python3.9[31105]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:53:05 compute-0 python3.9[31257]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:53:06 compute-0 python3.9[31409]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:53:07 compute-0 python3.9[31561]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:53:08 compute-0 python3.9[31713]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:53:08 compute-0 python3.9[31865]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:53:09 compute-0 python3.9[32017]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:53:10 compute-0 python3.9[32169]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:53:11 compute-0 podman[32293]: 2025-10-08 18:53:11.053700088 +0000 UTC m=+0.095692829 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct  8 18:53:11 compute-0 python3.9[32334]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:53:11 compute-0 podman[32464]: 2025-10-08 18:53:11.788803789 +0000 UTC m=+0.133670070 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  8 18:53:11 compute-0 python3.9[32511]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:53:12 compute-0 python3.9[32670]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:53:13 compute-0 python3.9[32822]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:53:14 compute-0 python3.9[32974]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 18:53:15 compute-0 python3.9[33126]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  8 18:53:16 compute-0 python3.9[33278]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  8 18:53:16 compute-0 systemd[1]: Reloading.
Oct  8 18:53:16 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:53:16 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:53:17 compute-0 python3.9[33465]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 18:53:18 compute-0 python3.9[33618]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 18:53:18 compute-0 python3.9[33771]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 18:53:19 compute-0 python3.9[33924]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 18:53:20 compute-0 python3.9[34077]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 18:53:21 compute-0 python3.9[34230]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 18:53:21 compute-0 python3.9[34383]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 18:53:22 compute-0 python3.9[34536]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Oct  8 18:53:23 compute-0 python3.9[34689]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  8 18:53:25 compute-0 python3.9[34847]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct  8 18:53:26 compute-0 python3.9[35007]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  8 18:53:27 compute-0 python3.9[35091]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  8 18:53:41 compute-0 podman[35240]: 2025-10-08 18:53:41.679212269 +0000 UTC m=+0.080613400 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  8 18:53:42 compute-0 podman[35287]: 2025-10-08 18:53:42.743667235 +0000 UTC m=+0.160106964 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 18:53:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:53:44.211 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 18:53:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:53:44.212 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 18:53:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:53:44.212 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 18:53:57 compute-0 kernel: SELinux:  Converting 429 SID table entries...
Oct  8 18:53:57 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Oct  8 18:53:57 compute-0 kernel: SELinux:  policy capability open_perms=1
Oct  8 18:53:57 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Oct  8 18:53:57 compute-0 kernel: SELinux:  policy capability always_check_network=0
Oct  8 18:53:57 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  8 18:53:57 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  8 18:53:57 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  8 18:54:06 compute-0 kernel: SELinux:  Converting 429 SID table entries...
Oct  8 18:54:06 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Oct  8 18:54:06 compute-0 kernel: SELinux:  policy capability open_perms=1
Oct  8 18:54:06 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Oct  8 18:54:06 compute-0 kernel: SELinux:  policy capability always_check_network=0
Oct  8 18:54:06 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  8 18:54:06 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  8 18:54:06 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  8 18:54:12 compute-0 dbus-broker-launch[836]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Oct  8 18:54:12 compute-0 podman[35348]: 2025-10-08 18:54:12.673087206 +0000 UTC m=+0.079712062 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 18:54:13 compute-0 podman[35367]: 2025-10-08 18:54:13.748459586 +0000 UTC m=+0.161262780 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3)
Oct  8 18:54:43 compute-0 podman[47860]: 2025-10-08 18:54:43.655207006 +0000 UTC m=+0.063780487 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Oct  8 18:54:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:54:44.212 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 18:54:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:54:44.213 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 18:54:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:54:44.213 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 18:54:44 compute-0 podman[48381]: 2025-10-08 18:54:44.686089339 +0000 UTC m=+0.106843647 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  8 18:55:04 compute-0 kernel: SELinux:  Converting 430 SID table entries...
Oct  8 18:55:04 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Oct  8 18:55:04 compute-0 kernel: SELinux:  policy capability open_perms=1
Oct  8 18:55:04 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Oct  8 18:55:04 compute-0 kernel: SELinux:  policy capability always_check_network=0
Oct  8 18:55:04 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  8 18:55:04 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  8 18:55:04 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  8 18:55:05 compute-0 dbus-broker-launch[835]: Noticed file-system modification, trigger reload.
Oct  8 18:55:05 compute-0 dbus-broker-launch[836]: avc:  op=load_policy lsm=selinux seqno=5 res=1
Oct  8 18:55:05 compute-0 dbus-broker-launch[835]: Noticed file-system modification, trigger reload.
Oct  8 18:55:13 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Oct  8 18:55:13 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Oct  8 18:55:13 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Oct  8 18:55:13 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Oct  8 18:55:13 compute-0 systemd[1]: Stopping sshd-keygen.target...
Oct  8 18:55:13 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  8 18:55:13 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  8 18:55:13 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  8 18:55:13 compute-0 systemd[1]: Reached target sshd-keygen.target.
Oct  8 18:55:13 compute-0 systemd[1]: Starting OpenSSH server daemon...
Oct  8 18:55:13 compute-0 systemd[1]: Started OpenSSH server daemon.
Oct  8 18:55:13 compute-0 podman[52973]: 2025-10-08 18:55:13.810598302 +0000 UTC m=+0.097082504 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 18:55:14 compute-0 podman[53105]: 2025-10-08 18:55:14.888327658 +0000 UTC m=+0.121141829 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible)
Oct  8 18:55:15 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  8 18:55:15 compute-0 systemd[1]: Starting man-db-cache-update.service...
Oct  8 18:55:15 compute-0 systemd[1]: Reloading.
Oct  8 18:55:15 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:55:15 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:55:16 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  8 18:55:19 compute-0 systemd[1]: Starting PackageKit Daemon...
Oct  8 18:55:19 compute-0 systemd[1]: Started PackageKit Daemon.
Oct  8 18:55:20 compute-0 python3.9[56781]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  8 18:55:20 compute-0 systemd[1]: Reloading.
Oct  8 18:55:20 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:55:20 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:55:21 compute-0 python3.9[57857]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  8 18:55:22 compute-0 systemd[1]: Reloading.
Oct  8 18:55:22 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:55:22 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:55:23 compute-0 python3.9[58936]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  8 18:55:23 compute-0 systemd[1]: Reloading.
Oct  8 18:55:23 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:55:23 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:55:24 compute-0 python3.9[59994]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  8 18:55:24 compute-0 systemd[1]: Reloading.
Oct  8 18:55:24 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:55:24 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:55:25 compute-0 python3.9[61216]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 18:55:25 compute-0 systemd[1]: Reloading.
Oct  8 18:55:25 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:55:25 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:55:26 compute-0 python3.9[62327]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 18:55:26 compute-0 systemd[1]: Reloading.
Oct  8 18:55:26 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:55:26 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:55:27 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  8 18:55:27 compute-0 systemd[1]: Finished man-db-cache-update.service.
Oct  8 18:55:27 compute-0 systemd[1]: man-db-cache-update.service: Consumed 13.141s CPU time.
Oct  8 18:55:27 compute-0 systemd[1]: run-r92d085bfb7384d62b55786ab9549f683.service: Deactivated successfully.
Oct  8 18:55:27 compute-0 python3.9[62773]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 18:55:28 compute-0 systemd[1]: Reloading.
Oct  8 18:55:28 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:55:28 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:55:29 compute-0 python3.9[62963]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 18:55:30 compute-0 python3.9[63118]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 18:55:30 compute-0 systemd[1]: Reloading.
Oct  8 18:55:30 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:55:30 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:55:31 compute-0 python3.9[63308]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  8 18:55:31 compute-0 systemd[1]: Reloading.
Oct  8 18:55:31 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:55:31 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:55:31 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Oct  8 18:55:31 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Oct  8 18:55:32 compute-0 python3.9[63502]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 18:55:33 compute-0 python3.9[63657]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 18:55:34 compute-0 python3.9[63812]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 18:55:35 compute-0 python3.9[63967]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 18:55:36 compute-0 python3.9[64122]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 18:55:37 compute-0 python3.9[64277]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 18:55:38 compute-0 python3.9[64432]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 18:55:39 compute-0 python3.9[64587]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 18:55:39 compute-0 systemd[1314]: Created slice User Background Tasks Slice.
Oct  8 18:55:39 compute-0 systemd[1314]: Starting Cleanup of User's Temporary Files and Directories...
Oct  8 18:55:39 compute-0 systemd[1314]: Finished Cleanup of User's Temporary Files and Directories.
Oct  8 18:55:40 compute-0 python3.9[64743]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 18:55:41 compute-0 python3.9[64898]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 18:55:41 compute-0 python3.9[65053]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 18:55:42 compute-0 python3.9[65208]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 18:55:43 compute-0 python3.9[65363]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 18:55:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:55:44.213 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 18:55:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:55:44.214 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 18:55:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:55:44.215 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 18:55:44 compute-0 podman[65490]: 2025-10-08 18:55:44.383477388 +0000 UTC m=+0.066556675 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001)
Oct  8 18:55:44 compute-0 python3.9[65537]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  8 18:55:45 compute-0 podman[65664]: 2025-10-08 18:55:45.537146004 +0000 UTC m=+0.097928709 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  8 18:55:45 compute-0 python3.9[65714]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:55:46 compute-0 python3.9[65871]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:55:47 compute-0 python3.9[66023]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:55:47 compute-0 python3.9[66175]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:55:48 compute-0 python3.9[66327]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:55:49 compute-0 python3.9[66479]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:55:50 compute-0 python3.9[66631]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:55:51 compute-0 python3.9[66756]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759949749.6356785-554-3698774746160/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:55:51 compute-0 python3.9[66908]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:55:52 compute-0 python3.9[67033]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759949751.4076536-554-212111276544674/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:55:53 compute-0 python3.9[67185]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:55:54 compute-0 python3.9[67310]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759949752.837564-554-96647499880372/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:55:54 compute-0 python3.9[67462]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:55:55 compute-0 python3.9[67587]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759949754.2665875-554-183634025307668/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:55:56 compute-0 python3.9[67739]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:55:57 compute-0 python3.9[67864]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759949755.7638974-554-241627026644268/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:55:57 compute-0 python3.9[68016]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:55:58 compute-0 python3.9[68141]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759949757.225823-554-105428098353981/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:55:59 compute-0 python3.9[68293]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:55:59 compute-0 python3.9[68416]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759949758.588169-554-224101721322219/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:00 compute-0 python3.9[68568]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:56:01 compute-0 python3.9[68693]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759949759.906468-554-177064624935986/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:01 compute-0 python3.9[68845]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Oct  8 18:56:02 compute-0 python3.9[68998]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:03 compute-0 python3.9[69150]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:04 compute-0 python3.9[69302]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:04 compute-0 python3.9[69454]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:05 compute-0 python3.9[69606]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:06 compute-0 python3.9[69758]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:07 compute-0 python3.9[69910]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:07 compute-0 python3.9[70062]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:08 compute-0 python3.9[70214]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:09 compute-0 python3.9[70366]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:10 compute-0 python3.9[70518]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:10 compute-0 python3.9[70670]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:11 compute-0 python3.9[70822]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:12 compute-0 python3.9[70974]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:13 compute-0 python3.9[71126]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:56:13 compute-0 python3.9[71249]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949772.6024625-775-64848924783671/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:14 compute-0 python3.9[71401]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:56:14 compute-0 podman[71402]: 2025-10-08 18:56:14.66658234 +0000 UTC m=+0.076192063 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent)
Oct  8 18:56:15 compute-0 python3.9[71543]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949773.9466448-775-174633968379587/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:15 compute-0 podman[71620]: 2025-10-08 18:56:15.663475415 +0000 UTC m=+0.086147839 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 18:56:15 compute-0 python3.9[71721]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:56:16 compute-0 python3.9[71844]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949775.455475-775-149693139539499/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:17 compute-0 python3.9[71996]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:56:18 compute-0 python3.9[72119]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949776.8199081-775-186647715145854/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:18 compute-0 python3.9[72271]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:56:19 compute-0 python3.9[72394]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949778.3340044-775-33193879510605/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:20 compute-0 python3.9[72546]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:56:21 compute-0 python3.9[72669]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949779.8889744-775-38153509636633/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:21 compute-0 python3.9[72821]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:56:22 compute-0 python3.9[72944]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949781.3838897-775-120043933417754/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:23 compute-0 python3.9[73096]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:56:24 compute-0 python3.9[73219]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949782.798273-775-250926935508102/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:24 compute-0 python3.9[73371]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:56:25 compute-0 python3.9[73494]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949784.2555404-775-67613677167536/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:26 compute-0 python3.9[73646]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:56:26 compute-0 python3.9[73769]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949785.619427-775-141969523735536/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:27 compute-0 python3.9[73921]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:56:28 compute-0 python3.9[74044]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949787.020448-775-17254962897182/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:28 compute-0 python3.9[74196]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:56:29 compute-0 python3.9[74319]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949788.434147-775-46627806559662/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:30 compute-0 python3.9[74471]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:56:31 compute-0 python3.9[74594]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949789.8767738-775-227109886021303/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:31 compute-0 python3.9[74746]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:56:32 compute-0 python3.9[74869]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949791.1916087-775-170584705252857/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:33 compute-0 python3.9[75019]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 18:56:34 compute-0 python3.9[75174]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Oct  8 18:56:35 compute-0 dbus-broker-launch[836]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Oct  8 18:56:36 compute-0 python3.9[75333]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:36 compute-0 python3.9[75485]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:37 compute-0 python3.9[75637]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:38 compute-0 python3.9[75789]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:39 compute-0 python3.9[75941]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:39 compute-0 python3.9[76093]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:40 compute-0 python3.9[76245]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:41 compute-0 python3.9[76397]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:42 compute-0 python3.9[76549]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:42 compute-0 python3.9[76701]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:43 compute-0 python3.9[76853]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 18:56:43 compute-0 systemd[1]: Reloading.
Oct  8 18:56:43 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:56:43 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:56:43 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Oct  8 18:56:43 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Oct  8 18:56:43 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Oct  8 18:56:43 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Oct  8 18:56:43 compute-0 systemd[1]: Starting libvirt logging daemon...
Oct  8 18:56:44 compute-0 systemd[1]: Started libvirt logging daemon.
Oct  8 18:56:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:56:44.214 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 18:56:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:56:44.215 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 18:56:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:56:44.215 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 18:56:44 compute-0 python3.9[77046]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 18:56:44 compute-0 systemd[1]: Reloading.
Oct  8 18:56:45 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:56:45 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:56:45 compute-0 podman[77048]: 2025-10-08 18:56:45.086525277 +0000 UTC m=+0.110779895 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  8 18:56:45 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Oct  8 18:56:45 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Oct  8 18:56:45 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Oct  8 18:56:45 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Oct  8 18:56:45 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Oct  8 18:56:45 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Oct  8 18:56:45 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Oct  8 18:56:45 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Oct  8 18:56:45 compute-0 systemd[1]: Started libvirt nodedev daemon.
Oct  8 18:56:45 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Oct  8 18:56:45 compute-0 podman[77230]: 2025-10-08 18:56:45.893784758 +0000 UTC m=+0.141313173 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct  8 18:56:45 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Oct  8 18:56:45 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Oct  8 18:56:46 compute-0 python3.9[77310]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 18:56:46 compute-0 systemd[1]: Reloading.
Oct  8 18:56:46 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:56:46 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:56:46 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Oct  8 18:56:46 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Oct  8 18:56:46 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Oct  8 18:56:46 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Oct  8 18:56:46 compute-0 systemd[1]: Starting libvirt proxy daemon...
Oct  8 18:56:46 compute-0 systemd[1]: Started libvirt proxy daemon.
Oct  8 18:56:46 compute-0 setroubleshoot[77102]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l be07387d-8aeb-4798-8ef4-711c36244e22
Oct  8 18:56:46 compute-0 setroubleshoot[77102]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Oct  8 18:56:46 compute-0 setroubleshoot[77102]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l be07387d-8aeb-4798-8ef4-711c36244e22
Oct  8 18:56:46 compute-0 setroubleshoot[77102]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Oct  8 18:56:47 compute-0 python3.9[77527]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 18:56:47 compute-0 systemd[1]: Reloading.
Oct  8 18:56:47 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:56:47 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:56:47 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Oct  8 18:56:47 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Oct  8 18:56:47 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Oct  8 18:56:47 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Oct  8 18:56:47 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Oct  8 18:56:47 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Oct  8 18:56:47 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Oct  8 18:56:47 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Oct  8 18:56:47 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Oct  8 18:56:47 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Oct  8 18:56:47 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Oct  8 18:56:47 compute-0 systemd[1]: Started libvirt QEMU daemon.
Oct  8 18:56:48 compute-0 python3.9[77741]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 18:56:48 compute-0 systemd[1]: Reloading.
Oct  8 18:56:48 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:56:48 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:56:49 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Oct  8 18:56:49 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Oct  8 18:56:49 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Oct  8 18:56:49 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Oct  8 18:56:49 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Oct  8 18:56:49 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Oct  8 18:56:49 compute-0 systemd[1]: Starting libvirt secret daemon...
Oct  8 18:56:49 compute-0 systemd[1]: Started libvirt secret daemon.
Oct  8 18:56:50 compute-0 python3.9[77952]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:50 compute-0 python3.9[78104]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  8 18:56:51 compute-0 python3.9[78256]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:56:51 compute-0 rsyslogd[1288]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  8 18:56:51 compute-0 rsyslogd[1288]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  8 18:56:52 compute-0 python3.9[78380]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759949811.278163-1120-120463055275706/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:53 compute-0 python3.9[78532]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:54 compute-0 python3.9[78684]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:56:54 compute-0 python3.9[78762]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:55 compute-0 python3.9[78914]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:56:56 compute-0 python3.9[78992]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.eli36ciz recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:56 compute-0 python3.9[79144]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:56:56 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Oct  8 18:56:57 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Oct  8 18:56:57 compute-0 python3.9[79223]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:56:58 compute-0 python3.9[79375]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 18:56:58 compute-0 python3[79528]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct  8 18:56:59 compute-0 python3.9[79680]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:57:00 compute-0 python3.9[79758]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:57:01 compute-0 python3.9[79910]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:57:01 compute-0 python3.9[79988]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:57:02 compute-0 python3.9[80140]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:57:03 compute-0 python3.9[80218]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:57:03 compute-0 python3.9[80370]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:57:04 compute-0 python3.9[80448]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:57:05 compute-0 python3.9[80600]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:57:05 compute-0 python3.9[80725]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759949824.6477885-1245-85778835292306/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:57:06 compute-0 python3.9[80877]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:57:07 compute-0 python3.9[81029]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 18:57:08 compute-0 python3.9[81184]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:57:09 compute-0 python3.9[81336]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 18:57:10 compute-0 python3.9[81489]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 18:57:11 compute-0 python3.9[81643]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 18:57:11 compute-0 python3.9[81798]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:57:12 compute-0 python3.9[81950]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:57:13 compute-0 python3.9[82073]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759949832.035195-1317-144450595839219/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:57:13 compute-0 python3.9[82225]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:57:14 compute-0 python3.9[82348]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759949833.3957467-1332-229620971419936/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:57:15 compute-0 python3.9[82500]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:57:15 compute-0 podman[82530]: 2025-10-08 18:57:15.666847851 +0000 UTC m=+0.075584825 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  8 18:57:16 compute-0 python3.9[82643]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759949834.8769894-1347-261928793071585/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:57:16 compute-0 podman[82746]: 2025-10-08 18:57:16.69790287 +0000 UTC m=+0.113695311 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Oct  8 18:57:17 compute-0 python3.9[82821]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 18:57:17 compute-0 systemd[1]: Reloading.
Oct  8 18:57:17 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:57:17 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:57:17 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Oct  8 18:57:18 compute-0 python3.9[83013]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Oct  8 18:57:18 compute-0 systemd[1]: Reloading.
Oct  8 18:57:18 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:57:18 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:57:18 compute-0 systemd[1]: Reloading.
Oct  8 18:57:18 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:57:18 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:57:19 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Oct  8 18:57:19 compute-0 systemd[1]: session-7.scope: Consumed 3min 50.466s CPU time.
Oct  8 18:57:19 compute-0 systemd-logind[844]: Session 7 logged out. Waiting for processes to exit.
Oct  8 18:57:19 compute-0 systemd-logind[844]: Removed session 7.
Oct  8 18:57:24 compute-0 systemd-logind[844]: New session 8 of user zuul.
Oct  8 18:57:24 compute-0 systemd[1]: Started Session 8 of User zuul.
Oct  8 18:57:25 compute-0 python3.9[83262]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 18:57:27 compute-0 python3.9[83418]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:57:27 compute-0 python3.9[83570]: ansible-ansible.builtin.file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:57:28 compute-0 python3.9[83722]: ansible-ansible.builtin.file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:57:29 compute-0 python3.9[83874]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct  8 18:57:30 compute-0 python3.9[84026]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data/ansible-generated/iscsid setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:57:31 compute-0 python3.9[84178]: ansible-ansible.builtin.stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 18:57:32 compute-0 python3.9[84332]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsid.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 18:57:32 compute-0 systemd[1]: Reloading.
Oct  8 18:57:32 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:57:32 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:57:33 compute-0 python3.9[84521]: ansible-ansible.builtin.service_facts Invoked
Oct  8 18:57:33 compute-0 network[84538]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  8 18:57:33 compute-0 network[84539]: 'network-scripts' will be removed from distribution in near future.
Oct  8 18:57:33 compute-0 network[84540]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  8 18:57:38 compute-0 python3.9[84813]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsi-starter.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 18:57:38 compute-0 systemd[1]: Reloading.
Oct  8 18:57:39 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:57:39 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:57:40 compute-0 python3.9[85000]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 18:57:41 compute-0 python3.9[85152]: ansible-containers.podman.podman_container Invoked with command=/usr/sbin/iscsi-iname detach=False image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f name=iscsid_config rm=True tty=True executable=podman state=started debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct  8 18:57:41 compute-0 podman[85187]: 2025-10-08 18:57:41.421983955 +0000 UTC m=+0.053951591 container create bd0e98d68ba1af1ef58619b81edec497c6f60988687d5e2b593f6a5f203b944d (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid_config, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct  8 18:57:41 compute-0 rsyslogd[1288]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  8 18:57:41 compute-0 NetworkManager[1035]: <info>  [1759949861.4831] manager: (podman0): new Bridge device (/org/freedesktop/NetworkManager/Devices/19)
Oct  8 18:57:41 compute-0 podman[85187]: 2025-10-08 18:57:41.401368236 +0000 UTC m=+0.033335892 image pull 74877095db294c27659f24e7f86074178a6f28eee68561c30e3ce4d18519e09c quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f
Oct  8 18:57:41 compute-0 kernel: podman0: port 1(veth0) entered blocking state
Oct  8 18:57:41 compute-0 kernel: podman0: port 1(veth0) entered disabled state
Oct  8 18:57:41 compute-0 kernel: veth0: entered allmulticast mode
Oct  8 18:57:41 compute-0 kernel: veth0: entered promiscuous mode
Oct  8 18:57:41 compute-0 kernel: podman0: port 1(veth0) entered blocking state
Oct  8 18:57:41 compute-0 kernel: podman0: port 1(veth0) entered forwarding state
Oct  8 18:57:41 compute-0 NetworkManager[1035]: <info>  [1759949861.5111] manager: (veth0): new Veth device (/org/freedesktop/NetworkManager/Devices/20)
Oct  8 18:57:41 compute-0 NetworkManager[1035]: <info>  [1759949861.5134] device (veth0): carrier: link connected
Oct  8 18:57:41 compute-0 NetworkManager[1035]: <info>  [1759949861.5138] device (podman0): carrier: link connected
Oct  8 18:57:41 compute-0 systemd-udevd[85215]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 18:57:41 compute-0 systemd-udevd[85218]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 18:57:41 compute-0 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Oct  8 18:57:41 compute-0 NetworkManager[1035]: <info>  [1759949861.5569] device (podman0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 18:57:41 compute-0 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Oct  8 18:57:41 compute-0 NetworkManager[1035]: <info>  [1759949861.5582] device (podman0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct  8 18:57:41 compute-0 NetworkManager[1035]: <info>  [1759949861.5597] device (podman0): Activation: starting connection 'podman0' (a1f0edc0-82df-477b-98ca-0e9087639e8e)
Oct  8 18:57:41 compute-0 NetworkManager[1035]: <info>  [1759949861.5600] device (podman0): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct  8 18:57:41 compute-0 NetworkManager[1035]: <info>  [1759949861.5603] device (podman0): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct  8 18:57:41 compute-0 NetworkManager[1035]: <info>  [1759949861.5609] device (podman0): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct  8 18:57:41 compute-0 NetworkManager[1035]: <info>  [1759949861.5613] device (podman0): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct  8 18:57:41 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  8 18:57:41 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  8 18:57:41 compute-0 NetworkManager[1035]: <info>  [1759949861.5877] device (podman0): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct  8 18:57:41 compute-0 NetworkManager[1035]: <info>  [1759949861.5880] device (podman0): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct  8 18:57:41 compute-0 NetworkManager[1035]: <info>  [1759949861.5888] device (podman0): Activation: successful, device activated.
Oct  8 18:57:41 compute-0 systemd[1]: iscsi.service: Unit cannot be reloaded because it is inactive.
Oct  8 18:57:41 compute-0 systemd[1]: Started libpod-conmon-bd0e98d68ba1af1ef58619b81edec497c6f60988687d5e2b593f6a5f203b944d.scope.
Oct  8 18:57:41 compute-0 systemd[1]: Started libcrun container.
Oct  8 18:57:41 compute-0 podman[85187]: 2025-10-08 18:57:41.887503298 +0000 UTC m=+0.519470954 container init bd0e98d68ba1af1ef58619b81edec497c6f60988687d5e2b593f6a5f203b944d (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid_config, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 18:57:41 compute-0 podman[85187]: 2025-10-08 18:57:41.89601921 +0000 UTC m=+0.527986856 container start bd0e98d68ba1af1ef58619b81edec497c6f60988687d5e2b593f6a5f203b944d (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid_config, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 18:57:41 compute-0 podman[85187]: 2025-10-08 18:57:41.899350047 +0000 UTC m=+0.531317703 container attach bd0e98d68ba1af1ef58619b81edec497c6f60988687d5e2b593f6a5f203b944d (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid_config, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct  8 18:57:41 compute-0 iscsid_config[85351]: iqn.1994-05.com.redhat:c69c6f2d1774#015
Oct  8 18:57:41 compute-0 systemd[1]: libpod-bd0e98d68ba1af1ef58619b81edec497c6f60988687d5e2b593f6a5f203b944d.scope: Deactivated successfully.
Oct  8 18:57:41 compute-0 conmon[85351]: conmon bd0e98d68ba1af1ef586 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-bd0e98d68ba1af1ef58619b81edec497c6f60988687d5e2b593f6a5f203b944d.scope/container/memory.events
Oct  8 18:57:41 compute-0 podman[85187]: 2025-10-08 18:57:41.908207299 +0000 UTC m=+0.540174935 container died bd0e98d68ba1af1ef58619b81edec497c6f60988687d5e2b593f6a5f203b944d (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid_config, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 18:57:41 compute-0 kernel: podman0: port 1(veth0) entered disabled state
Oct  8 18:57:41 compute-0 kernel: veth0 (unregistering): left allmulticast mode
Oct  8 18:57:41 compute-0 kernel: veth0 (unregistering): left promiscuous mode
Oct  8 18:57:41 compute-0 kernel: podman0: port 1(veth0) entered disabled state
Oct  8 18:57:41 compute-0 NetworkManager[1035]: <info>  [1759949861.9807] device (podman0): state change: activated -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 18:57:42 compute-0 systemd[1]: run-netns-netns\x2dfc6b247d\x2deeb4\x2dfcd6\x2d87fd\x2d6cf17d892959.mount: Deactivated successfully.
Oct  8 18:57:42 compute-0 podman[85187]: 2025-10-08 18:57:42.379567873 +0000 UTC m=+1.011535509 container remove bd0e98d68ba1af1ef58619b81edec497c6f60988687d5e2b593f6a5f203b944d (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid_config, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 18:57:42 compute-0 python3.9[85152]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman run --name iscsid_config --detach=False --rm --tty=True quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f /usr/sbin/iscsi-iname
Oct  8 18:57:42 compute-0 systemd[1]: libpod-conmon-bd0e98d68ba1af1ef58619b81edec497c6f60988687d5e2b593f6a5f203b944d.scope: Deactivated successfully.
Oct  8 18:57:42 compute-0 systemd[1]: var-lib-containers-storage-overlay-7264b161f5c9e247605b5d6c84405c5f66ea8d06ec38ec28d9ccb5ac692914af-merged.mount: Deactivated successfully.
Oct  8 18:57:42 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bd0e98d68ba1af1ef58619b81edec497c6f60988687d5e2b593f6a5f203b944d-userdata-shm.mount: Deactivated successfully.
Oct  8 18:57:42 compute-0 python3.9[85152]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: Error generating systemd: #012DEPRECATED command:#012It is recommended to use Quadlets for running containers and pods under systemd.#012#012Please refer to podman-systemd.unit(5) for details.#012Error: iscsid_config does not refer to a container or pod: no pod with name or ID iscsid_config found: no such pod: no container with name or ID "iscsid_config" found: no such container
Oct  8 18:57:43 compute-0 python3.9[85592]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:57:44 compute-0 python3.9[85715]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759949862.696217-119-123593211545325/.source.iscsi _original_basename=.hmug41v6 follow=False checksum=0b9689ec2ce017595dffb256feaf6209f1462b6d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:57:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:57:44.216 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 18:57:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:57:44.217 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 18:57:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:57:44.217 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 18:57:45 compute-0 python3.9[85867]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:57:45 compute-0 python3.9[86017]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/iscsid.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 18:57:46 compute-0 podman[86143]: 2025-10-08 18:57:46.581995367 +0000 UTC m=+0.091295682 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Oct  8 18:57:46 compute-0 python3.9[86187]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:57:47 compute-0 podman[86314]: 2025-10-08 18:57:47.465884509 +0000 UTC m=+0.101963430 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  8 18:57:47 compute-0 python3.9[86361]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:57:48 compute-0 python3.9[86520]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:57:48 compute-0 python3.9[86598]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:57:49 compute-0 python3.9[86750]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:57:50 compute-0 python3.9[86828]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:57:50 compute-0 python3.9[86980]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:57:51 compute-0 python3.9[87132]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:57:52 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  8 18:57:52 compute-0 python3.9[87210]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:57:53 compute-0 python3.9[87362]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:57:53 compute-0 python3.9[87440]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:57:54 compute-0 python3.9[87592]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 18:57:54 compute-0 systemd[1]: Reloading.
Oct  8 18:57:54 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:57:54 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:57:55 compute-0 python3.9[87780]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:57:56 compute-0 python3.9[87858]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:57:57 compute-0 python3.9[88010]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:57:57 compute-0 python3.9[88088]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:57:58 compute-0 python3.9[88240]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 18:57:58 compute-0 systemd[1]: Reloading.
Oct  8 18:57:58 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:57:58 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:57:58 compute-0 systemd[1]: Starting Create netns directory...
Oct  8 18:57:58 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  8 18:57:58 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  8 18:57:58 compute-0 systemd[1]: Finished Create netns directory.
Oct  8 18:57:59 compute-0 python3.9[88432]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:58:00 compute-0 python3.9[88584]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/iscsid/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:58:00 compute-0 python3.9[88707]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/iscsid/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759949879.8423088-273-112768493057349/.source _original_basename=healthcheck follow=False checksum=2e1237e7fe015c809b173c52e24cfb87132f4344 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:58:01 compute-0 python3.9[88859]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:58:02 compute-0 python3.9[89011]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/iscsid.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:58:03 compute-0 python3.9[89134]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/iscsid.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759949882.2138197-298-181482929234274/.source.json _original_basename=.60b0gzvx follow=False checksum=80e4f97460718c7e5c66b21ef8b846eba0e0dbc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:58:04 compute-0 python3.9[89286]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/iscsid state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:58:06 compute-0 python3.9[89713]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/iscsid config_pattern=*.json debug=False
Oct  8 18:58:07 compute-0 python3.9[89865]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  8 18:58:08 compute-0 python3.9[90017]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct  8 18:58:10 compute-0 python3[90195]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/iscsid config_id=iscsid config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct  8 18:58:10 compute-0 podman[90232]: 2025-10-08 18:58:10.443302425 +0000 UTC m=+0.078584449 container create 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  8 18:58:10 compute-0 podman[90232]: 2025-10-08 18:58:10.404403497 +0000 UTC m=+0.039685561 image pull 74877095db294c27659f24e7f86074178a6f28eee68561c30e3ce4d18519e09c quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f
Oct  8 18:58:10 compute-0 python3[90195]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name iscsid --conmon-pidfile /run/iscsid.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=iscsid --label container_name=iscsid --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:z --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/openstack/healthchecks/iscsid:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f
Oct  8 18:58:11 compute-0 python3.9[90421]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 18:58:12 compute-0 python3.9[90575]: ansible-file Invoked with path=/etc/systemd/system/edpm_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:58:12 compute-0 python3.9[90651]: ansible-stat Invoked with path=/etc/systemd/system/edpm_iscsid_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 18:58:13 compute-0 python3.9[90802]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759949892.8219678-386-61766793258991/source dest=/etc/systemd/system/edpm_iscsid.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:58:14 compute-0 python3.9[90878]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  8 18:58:14 compute-0 systemd[1]: Reloading.
Oct  8 18:58:14 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:58:14 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:58:15 compute-0 python3.9[90988]: ansible-systemd Invoked with state=restarted name=edpm_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 18:58:15 compute-0 systemd[1]: Reloading.
Oct  8 18:58:15 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:58:15 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:58:15 compute-0 systemd[1]: Starting iscsid container...
Oct  8 18:58:15 compute-0 systemd[1]: Started libcrun container.
Oct  8 18:58:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/464a574f8cabd9aa5d75cf9f09985ec149dd35110e8bee7783d53abdfa52b1bf/merged/etc/target supports timestamps until 2038 (0x7fffffff)
Oct  8 18:58:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/464a574f8cabd9aa5d75cf9f09985ec149dd35110e8bee7783d53abdfa52b1bf/merged/etc/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  8 18:58:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/464a574f8cabd9aa5d75cf9f09985ec149dd35110e8bee7783d53abdfa52b1bf/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  8 18:58:15 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845.
Oct  8 18:58:15 compute-0 podman[91028]: 2025-10-08 18:58:15.723721289 +0000 UTC m=+0.163008194 container init 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.license=GPLv2, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Oct  8 18:58:15 compute-0 iscsid[91044]: + sudo -E kolla_set_configs
Oct  8 18:58:15 compute-0 podman[91028]: 2025-10-08 18:58:15.754911045 +0000 UTC m=+0.194197920 container start 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=iscsid)
Oct  8 18:58:15 compute-0 podman[91028]: iscsid
Oct  8 18:58:15 compute-0 systemd[1]: Started iscsid container.
Oct  8 18:58:15 compute-0 systemd[1]: Created slice User Slice of UID 0.
Oct  8 18:58:15 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Oct  8 18:58:15 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Oct  8 18:58:15 compute-0 systemd[1]: Starting User Manager for UID 0...
Oct  8 18:58:15 compute-0 podman[91050]: 2025-10-08 18:58:15.873280025 +0000 UTC m=+0.096912345 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=starting, health_failing_streak=1, health_log=, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid)
Oct  8 18:58:15 compute-0 systemd[1]: 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845-211134ce0ad75163.service: Main process exited, code=exited, status=1/FAILURE
Oct  8 18:58:15 compute-0 systemd[1]: 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845-211134ce0ad75163.service: Failed with result 'exit-code'.
Oct  8 18:58:15 compute-0 systemd[91070]: Queued start job for default target Main User Target.
Oct  8 18:58:15 compute-0 systemd[91070]: Created slice User Application Slice.
Oct  8 18:58:15 compute-0 systemd[91070]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Oct  8 18:58:15 compute-0 systemd[91070]: Started Daily Cleanup of User's Temporary Directories.
Oct  8 18:58:15 compute-0 systemd[91070]: Reached target Paths.
Oct  8 18:58:15 compute-0 systemd[91070]: Reached target Timers.
Oct  8 18:58:15 compute-0 systemd[91070]: Starting D-Bus User Message Bus Socket...
Oct  8 18:58:15 compute-0 systemd[91070]: Starting Create User's Volatile Files and Directories...
Oct  8 18:58:16 compute-0 systemd[91070]: Listening on D-Bus User Message Bus Socket.
Oct  8 18:58:16 compute-0 systemd[91070]: Reached target Sockets.
Oct  8 18:58:16 compute-0 systemd[91070]: Finished Create User's Volatile Files and Directories.
Oct  8 18:58:16 compute-0 systemd[91070]: Reached target Basic System.
Oct  8 18:58:16 compute-0 systemd[91070]: Reached target Main User Target.
Oct  8 18:58:16 compute-0 systemd[91070]: Startup finished in 136ms.
Oct  8 18:58:16 compute-0 systemd[1]: Started User Manager for UID 0.
Oct  8 18:58:16 compute-0 systemd[1]: Started Session c3 of User root.
Oct  8 18:58:16 compute-0 iscsid[91044]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  8 18:58:16 compute-0 iscsid[91044]: INFO:__main__:Validating config file
Oct  8 18:58:16 compute-0 iscsid[91044]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  8 18:58:16 compute-0 iscsid[91044]: INFO:__main__:Writing out command to execute
Oct  8 18:58:16 compute-0 systemd[1]: session-c3.scope: Deactivated successfully.
Oct  8 18:58:16 compute-0 iscsid[91044]: ++ cat /run_command
Oct  8 18:58:16 compute-0 iscsid[91044]: + CMD='/usr/sbin/iscsid -f'
Oct  8 18:58:16 compute-0 iscsid[91044]: + ARGS=
Oct  8 18:58:16 compute-0 iscsid[91044]: + sudo kolla_copy_cacerts
Oct  8 18:58:16 compute-0 systemd[1]: Started Session c4 of User root.
Oct  8 18:58:16 compute-0 systemd[1]: session-c4.scope: Deactivated successfully.
Oct  8 18:58:16 compute-0 iscsid[91044]: + [[ ! -n '' ]]
Oct  8 18:58:16 compute-0 iscsid[91044]: + . kolla_extend_start
Oct  8 18:58:16 compute-0 iscsid[91044]: ++ [[ ! -f /etc/iscsi/initiatorname.iscsi ]]
Oct  8 18:58:16 compute-0 iscsid[91044]: + echo 'Running command: '\''/usr/sbin/iscsid -f'\'''
Oct  8 18:58:16 compute-0 iscsid[91044]: + umask 0022
Oct  8 18:58:16 compute-0 iscsid[91044]: Running command: '/usr/sbin/iscsid -f'
Oct  8 18:58:16 compute-0 iscsid[91044]: + exec /usr/sbin/iscsid -f
Oct  8 18:58:16 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Oct  8 18:58:16 compute-0 python3.9[91249]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.iscsid_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 18:58:17 compute-0 podman[91373]: 2025-10-08 18:58:17.11581052 +0000 UTC m=+0.107595572 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  8 18:58:17 compute-0 python3.9[91420]: ansible-ansible.builtin.file Invoked with path=/etc/iscsi/.iscsid_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:58:17 compute-0 podman[91448]: 2025-10-08 18:58:17.682904072 +0000 UTC m=+0.106446919 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct  8 18:58:18 compute-0 python3.9[91603]: ansible-ansible.builtin.service_facts Invoked
Oct  8 18:58:18 compute-0 network[91620]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  8 18:58:18 compute-0 network[91621]: 'network-scripts' will be removed from distribution in near future.
Oct  8 18:58:18 compute-0 network[91622]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  8 18:58:23 compute-0 python3.9[91896]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct  8 18:58:24 compute-0 python3.9[92048]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Oct  8 18:58:25 compute-0 python3.9[92204]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:58:25 compute-0 python3.9[92327]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759949904.4636195-460-218523749388223/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:58:26 compute-0 systemd[1]: Stopping User Manager for UID 0...
Oct  8 18:58:26 compute-0 systemd[91070]: Activating special unit Exit the Session...
Oct  8 18:58:26 compute-0 systemd[91070]: Stopped target Main User Target.
Oct  8 18:58:26 compute-0 systemd[91070]: Stopped target Basic System.
Oct  8 18:58:26 compute-0 systemd[91070]: Stopped target Paths.
Oct  8 18:58:26 compute-0 systemd[91070]: Stopped target Sockets.
Oct  8 18:58:26 compute-0 systemd[91070]: Stopped target Timers.
Oct  8 18:58:26 compute-0 systemd[91070]: Stopped Daily Cleanup of User's Temporary Directories.
Oct  8 18:58:26 compute-0 systemd[91070]: Closed D-Bus User Message Bus Socket.
Oct  8 18:58:26 compute-0 systemd[91070]: Stopped Create User's Volatile Files and Directories.
Oct  8 18:58:26 compute-0 systemd[91070]: Removed slice User Application Slice.
Oct  8 18:58:26 compute-0 systemd[91070]: Reached target Shutdown.
Oct  8 18:58:26 compute-0 systemd[91070]: Finished Exit the Session.
Oct  8 18:58:26 compute-0 systemd[91070]: Reached target Exit the Session.
Oct  8 18:58:26 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Oct  8 18:58:26 compute-0 systemd[1]: Stopped User Manager for UID 0.
Oct  8 18:58:26 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct  8 18:58:26 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Oct  8 18:58:26 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct  8 18:58:26 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct  8 18:58:26 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Oct  8 18:58:26 compute-0 python3.9[92480]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:58:27 compute-0 python3.9[92632]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 18:58:27 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Oct  8 18:58:27 compute-0 systemd[1]: Stopped Load Kernel Modules.
Oct  8 18:58:27 compute-0 systemd[1]: Stopping Load Kernel Modules...
Oct  8 18:58:27 compute-0 systemd[1]: Starting Load Kernel Modules...
Oct  8 18:58:27 compute-0 systemd[1]: Finished Load Kernel Modules.
Oct  8 18:58:28 compute-0 python3.9[92788]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:58:29 compute-0 python3.9[92940]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 18:58:29 compute-0 python3.9[93092]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 18:58:30 compute-0 python3.9[93244]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:58:31 compute-0 python3.9[93367]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759949910.13523-518-187506763324848/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:58:32 compute-0 python3.9[93519]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 18:58:33 compute-0 python3.9[93672]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:58:33 compute-0 python3.9[93824]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:58:34 compute-0 python3.9[93976]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:58:35 compute-0 python3.9[94128]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:58:36 compute-0 python3.9[94280]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:58:37 compute-0 python3.9[94432]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:58:37 compute-0 python3.9[94584]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:58:38 compute-0 python3.9[94736]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 18:58:39 compute-0 python3.9[94890]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:58:40 compute-0 python3.9[95042]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:58:41 compute-0 python3.9[95194]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:58:41 compute-0 python3.9[95272]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:58:42 compute-0 python3.9[95424]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:58:42 compute-0 python3.9[95502]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:58:43 compute-0 python3.9[95654]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:58:44 compute-0 python3.9[95806]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:58:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:58:44.217 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 18:58:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:58:44.219 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 18:58:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:58:44.219 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 18:58:44 compute-0 python3.9[95884]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:58:45 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Oct  8 18:58:45 compute-0 python3.9[96036]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:58:46 compute-0 python3.9[96115]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:58:46 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct  8 18:58:46 compute-0 podman[96218]: 2025-10-08 18:58:46.679144403 +0000 UTC m=+0.092328102 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  8 18:58:47 compute-0 python3.9[96290]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 18:58:47 compute-0 systemd[1]: Reloading.
Oct  8 18:58:47 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:58:47 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:58:47 compute-0 podman[96326]: 2025-10-08 18:58:47.486980481 +0000 UTC m=+0.083672365 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 18:58:47 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Oct  8 18:58:48 compute-0 podman[96471]: 2025-10-08 18:58:48.140693641 +0000 UTC m=+0.175220885 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller)
Oct  8 18:58:48 compute-0 python3.9[96521]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:58:48 compute-0 python3.9[96604]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:58:49 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Oct  8 18:58:49 compute-0 python3.9[96757]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:58:50 compute-0 python3.9[96835]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:58:51 compute-0 python3.9[96987]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 18:58:51 compute-0 systemd[1]: Reloading.
Oct  8 18:58:51 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:58:51 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:58:51 compute-0 systemd[1]: Starting Create netns directory...
Oct  8 18:58:51 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  8 18:58:51 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  8 18:58:51 compute-0 systemd[1]: Finished Create netns directory.
Oct  8 18:58:52 compute-0 python3.9[97181]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:58:53 compute-0 python3.9[97333]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:58:54 compute-0 python3.9[97456]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759949932.73868-725-31010283865741/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:58:55 compute-0 python3.9[97608]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 18:58:55 compute-0 python3.9[97760]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:58:56 compute-0 python3.9[97883]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759949935.3053358-750-226047257646264/.source.json _original_basename=.13ncsuws follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:58:57 compute-0 python3.9[98035]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:58:59 compute-0 python3.9[98462]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Oct  8 18:59:00 compute-0 python3.9[98614]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  8 18:59:01 compute-0 python3.9[98766]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct  8 18:59:02 compute-0 python3[98944]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct  8 18:59:02 compute-0 podman[98979]: 2025-10-08 18:59:02.930685017 +0000 UTC m=+0.054971480 container create 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, tcib_managed=true, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 18:59:02 compute-0 podman[98979]: 2025-10-08 18:59:02.902881379 +0000 UTC m=+0.027167832 image pull f541ff382622bd8bc9ad206129d2a8e74c239ff4503fa3b67d3bdf6d5b50b511 quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43
Oct  8 18:59:02 compute-0 python3[98944]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43
Oct  8 18:59:03 compute-0 python3.9[99170]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 18:59:04 compute-0 python3.9[99324]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:59:05 compute-0 python3.9[99400]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 18:59:05 compute-0 python3.9[99551]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759949945.1748476-838-195206719469371/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:59:06 compute-0 python3.9[99627]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  8 18:59:06 compute-0 systemd[1]: Reloading.
Oct  8 18:59:06 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:59:06 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:59:07 compute-0 python3.9[99737]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 18:59:07 compute-0 systemd[1]: Reloading.
Oct  8 18:59:07 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:59:07 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:59:07 compute-0 systemd[1]: Starting multipathd container...
Oct  8 18:59:08 compute-0 systemd[1]: Started libcrun container.
Oct  8 18:59:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/517be04000c2e5d64e270c8a61cdaeaa39cf403d978930fe8216e7358faa20ca/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct  8 18:59:08 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/517be04000c2e5d64e270c8a61cdaeaa39cf403d978930fe8216e7358faa20ca/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  8 18:59:08 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d.
Oct  8 18:59:08 compute-0 podman[99777]: 2025-10-08 18:59:08.083252179 +0000 UTC m=+0.129413080 container init 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible)
Oct  8 18:59:08 compute-0 multipathd[99792]: + sudo -E kolla_set_configs
Oct  8 18:59:08 compute-0 podman[99777]: 2025-10-08 18:59:08.108402192 +0000 UTC m=+0.154563063 container start 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_managed=true)
Oct  8 18:59:08 compute-0 podman[99777]: multipathd
Oct  8 18:59:08 compute-0 systemd[1]: Started multipathd container.
Oct  8 18:59:08 compute-0 multipathd[99792]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  8 18:59:08 compute-0 multipathd[99792]: INFO:__main__:Validating config file
Oct  8 18:59:08 compute-0 multipathd[99792]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  8 18:59:08 compute-0 multipathd[99792]: INFO:__main__:Writing out command to execute
Oct  8 18:59:08 compute-0 multipathd[99792]: ++ cat /run_command
Oct  8 18:59:08 compute-0 multipathd[99792]: + CMD='/usr/sbin/multipathd -d'
Oct  8 18:59:08 compute-0 multipathd[99792]: + ARGS=
Oct  8 18:59:08 compute-0 multipathd[99792]: + sudo kolla_copy_cacerts
Oct  8 18:59:08 compute-0 podman[99798]: 2025-10-08 18:59:08.231616894 +0000 UTC m=+0.100368295 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  8 18:59:08 compute-0 systemd[1]: 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d-395fe86eb4747b73.service: Main process exited, code=exited, status=1/FAILURE
Oct  8 18:59:08 compute-0 systemd[1]: 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d-395fe86eb4747b73.service: Failed with result 'exit-code'.
Oct  8 18:59:08 compute-0 multipathd[99792]: + [[ ! -n '' ]]
Oct  8 18:59:08 compute-0 multipathd[99792]: + . kolla_extend_start
Oct  8 18:59:08 compute-0 multipathd[99792]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Oct  8 18:59:08 compute-0 multipathd[99792]: Running command: '/usr/sbin/multipathd -d'
Oct  8 18:59:08 compute-0 multipathd[99792]: + umask 0022
Oct  8 18:59:08 compute-0 multipathd[99792]: + exec /usr/sbin/multipathd -d
Oct  8 18:59:08 compute-0 multipathd[99792]: 618.898435 | --------start up--------
Oct  8 18:59:08 compute-0 multipathd[99792]: 618.898457 | read /etc/multipath.conf
Oct  8 18:59:08 compute-0 multipathd[99792]: 618.907779 | path checkers start up
Oct  8 18:59:08 compute-0 python3.9[99981]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 18:59:09 compute-0 python3.9[100135]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 18:59:10 compute-0 python3.9[100300]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 18:59:10 compute-0 systemd[1]: Stopping multipathd container...
Oct  8 18:59:10 compute-0 multipathd[99792]: 621.415308 | exit (signal)
Oct  8 18:59:10 compute-0 multipathd[99792]: 621.415372 | --------shut down-------
Oct  8 18:59:10 compute-0 systemd[1]: libpod-62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d.scope: Deactivated successfully.
Oct  8 18:59:10 compute-0 podman[100304]: 2025-10-08 18:59:10.817356538 +0000 UTC m=+0.073714830 container died 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct  8 18:59:10 compute-0 systemd[1]: 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d-395fe86eb4747b73.timer: Deactivated successfully.
Oct  8 18:59:10 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d.
Oct  8 18:59:10 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d-userdata-shm.mount: Deactivated successfully.
Oct  8 18:59:10 compute-0 systemd[1]: var-lib-containers-storage-overlay-517be04000c2e5d64e270c8a61cdaeaa39cf403d978930fe8216e7358faa20ca-merged.mount: Deactivated successfully.
Oct  8 18:59:10 compute-0 podman[100304]: 2025-10-08 18:59:10.864953538 +0000 UTC m=+0.121311830 container cleanup 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd)
Oct  8 18:59:10 compute-0 podman[100304]: multipathd
Oct  8 18:59:10 compute-0 podman[100335]: multipathd
Oct  8 18:59:10 compute-0 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Oct  8 18:59:10 compute-0 systemd[1]: Stopped multipathd container.
Oct  8 18:59:10 compute-0 systemd[1]: Starting multipathd container...
Oct  8 18:59:11 compute-0 systemd[1]: Started libcrun container.
Oct  8 18:59:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/517be04000c2e5d64e270c8a61cdaeaa39cf403d978930fe8216e7358faa20ca/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct  8 18:59:11 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/517be04000c2e5d64e270c8a61cdaeaa39cf403d978930fe8216e7358faa20ca/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  8 18:59:11 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d.
Oct  8 18:59:11 compute-0 podman[100348]: 2025-10-08 18:59:11.08019783 +0000 UTC m=+0.139724682 container init 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 18:59:11 compute-0 multipathd[100364]: + sudo -E kolla_set_configs
Oct  8 18:59:11 compute-0 podman[100348]: 2025-10-08 18:59:11.100987849 +0000 UTC m=+0.160514381 container start 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd)
Oct  8 18:59:11 compute-0 podman[100348]: multipathd
Oct  8 18:59:11 compute-0 systemd[1]: Started multipathd container.
Oct  8 18:59:11 compute-0 multipathd[100364]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  8 18:59:11 compute-0 multipathd[100364]: INFO:__main__:Validating config file
Oct  8 18:59:11 compute-0 multipathd[100364]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  8 18:59:11 compute-0 multipathd[100364]: INFO:__main__:Writing out command to execute
Oct  8 18:59:11 compute-0 multipathd[100364]: ++ cat /run_command
Oct  8 18:59:11 compute-0 multipathd[100364]: + CMD='/usr/sbin/multipathd -d'
Oct  8 18:59:11 compute-0 multipathd[100364]: + ARGS=
Oct  8 18:59:11 compute-0 multipathd[100364]: + sudo kolla_copy_cacerts
Oct  8 18:59:11 compute-0 podman[100371]: 2025-10-08 18:59:11.173932597 +0000 UTC m=+0.063465890 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd)
Oct  8 18:59:11 compute-0 systemd[1]: 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d-6de5b76987abf0f5.service: Main process exited, code=exited, status=1/FAILURE
Oct  8 18:59:11 compute-0 systemd[1]: 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d-6de5b76987abf0f5.service: Failed with result 'exit-code'.
Oct  8 18:59:11 compute-0 multipathd[100364]: + [[ ! -n '' ]]
Oct  8 18:59:11 compute-0 multipathd[100364]: + . kolla_extend_start
Oct  8 18:59:11 compute-0 multipathd[100364]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Oct  8 18:59:11 compute-0 multipathd[100364]: Running command: '/usr/sbin/multipathd -d'
Oct  8 18:59:11 compute-0 multipathd[100364]: + umask 0022
Oct  8 18:59:11 compute-0 multipathd[100364]: + exec /usr/sbin/multipathd -d
Oct  8 18:59:11 compute-0 multipathd[100364]: 621.827410 | --------start up--------
Oct  8 18:59:11 compute-0 multipathd[100364]: 621.827429 | read /etc/multipath.conf
Oct  8 18:59:11 compute-0 multipathd[100364]: 621.832177 | path checkers start up
Oct  8 18:59:11 compute-0 python3.9[100552]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:59:12 compute-0 python3.9[100704]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct  8 18:59:13 compute-0 python3.9[100856]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Oct  8 18:59:13 compute-0 kernel: Key type psk registered
Oct  8 18:59:14 compute-0 python3.9[101019]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 18:59:15 compute-0 python3.9[101142]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759949953.808738-918-15789161818488/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:59:15 compute-0 python3.9[101294]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:59:16 compute-0 python3.9[101446]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 18:59:16 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Oct  8 18:59:16 compute-0 systemd[1]: Stopped Load Kernel Modules.
Oct  8 18:59:16 compute-0 systemd[1]: Stopping Load Kernel Modules...
Oct  8 18:59:16 compute-0 systemd[1]: Starting Load Kernel Modules...
Oct  8 18:59:16 compute-0 systemd[1]: Finished Load Kernel Modules.
Oct  8 18:59:16 compute-0 podman[101448]: 2025-10-08 18:59:16.929166222 +0000 UTC m=+0.098271087 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.vendor=CentOS)
Oct  8 18:59:17 compute-0 podman[101620]: 2025-10-08 18:59:17.637956175 +0000 UTC m=+0.077767795 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 18:59:17 compute-0 python3.9[101621]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  8 18:59:18 compute-0 podman[101688]: 2025-10-08 18:59:18.69705536 +0000 UTC m=+0.107542069 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 18:59:18 compute-0 python3.9[101743]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  8 18:59:25 compute-0 systemd[1]: Reloading.
Oct  8 18:59:25 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:59:25 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:59:25 compute-0 systemd[1]: Reloading.
Oct  8 18:59:25 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:59:25 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:59:25 compute-0 systemd-logind[844]: Watching system buttons on /dev/input/event0 (Power Button)
Oct  8 18:59:25 compute-0 systemd-logind[844]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Oct  8 18:59:26 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  8 18:59:26 compute-0 systemd[1]: Starting man-db-cache-update.service...
Oct  8 18:59:26 compute-0 systemd[1]: Reloading.
Oct  8 18:59:26 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:59:26 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:59:26 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  8 18:59:27 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  8 18:59:27 compute-0 systemd[1]: Finished man-db-cache-update.service.
Oct  8 18:59:27 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.569s CPU time.
Oct  8 18:59:27 compute-0 systemd[1]: run-rba63fb3783dd4aea876b6703da0c6061.service: Deactivated successfully.
Oct  8 18:59:27 compute-0 python3.9[103198]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.iscsid_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:59:28 compute-0 python3.9[103348]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 18:59:29 compute-0 python3.9[103504]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:59:31 compute-0 python3.9[103656]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  8 18:59:31 compute-0 systemd[1]: Reloading.
Oct  8 18:59:31 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 18:59:31 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 18:59:32 compute-0 python3.9[103842]: ansible-ansible.builtin.service_facts Invoked
Oct  8 18:59:32 compute-0 network[103859]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  8 18:59:32 compute-0 network[103860]: 'network-scripts' will be removed from distribution in near future.
Oct  8 18:59:32 compute-0 network[103861]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  8 18:59:37 compute-0 python3.9[104138]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 18:59:37 compute-0 python3.9[104291]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 18:59:39 compute-0 python3.9[104444]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 18:59:41 compute-0 podman[104569]: 2025-10-08 18:59:41.370781193 +0000 UTC m=+0.064293784 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible)
Oct  8 18:59:41 compute-0 python3.9[104616]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 18:59:42 compute-0 python3.9[104769]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 18:59:43 compute-0 python3.9[104922]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 18:59:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:59:44.219 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 18:59:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:59:44.219 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 18:59:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 18:59:44.219 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 18:59:45 compute-0 python3.9[105075]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 18:59:45 compute-0 python3.9[105228]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 18:59:46 compute-0 python3.9[105381]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:59:47 compute-0 podman[105505]: 2025-10-08 18:59:47.203473206 +0000 UTC m=+0.068956066 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 18:59:47 compute-0 python3.9[105551]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:59:47 compute-0 podman[105675]: 2025-10-08 18:59:47.892222541 +0000 UTC m=+0.069531302 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Oct  8 18:59:48 compute-0 python3.9[105723]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:59:48 compute-0 python3.9[105875]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:59:49 compute-0 podman[105999]: 2025-10-08 18:59:49.352766847 +0000 UTC m=+0.150364334 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct  8 18:59:49 compute-0 python3.9[106049]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:59:50 compute-0 python3.9[106208]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:59:50 compute-0 python3.9[106360]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:59:51 compute-0 python3.9[106512]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:59:52 compute-0 python3.9[106664]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:59:53 compute-0 python3.9[106816]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:59:53 compute-0 python3.9[106968]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:59:54 compute-0 python3.9[107120]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:59:55 compute-0 python3.9[107272]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:59:56 compute-0 python3.9[107424]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:59:56 compute-0 python3.9[107576]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:59:57 compute-0 python3.9[107728]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 18:59:58 compute-0 python3.9[107880]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 18:59:59 compute-0 python3.9[108032]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  8 19:00:00 compute-0 python3.9[108184]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  8 19:00:00 compute-0 systemd[1]: Starting system activity accounting tool...
Oct  8 19:00:00 compute-0 systemd[1]: Reloading.
Oct  8 19:00:00 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 19:00:00 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 19:00:00 compute-0 systemd[1]: sysstat-collect.service: Deactivated successfully.
Oct  8 19:00:00 compute-0 systemd[1]: Finished system activity accounting tool.
Oct  8 19:00:01 compute-0 python3.9[108373]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 19:00:02 compute-0 python3.9[108526]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 19:00:02 compute-0 python3.9[108679]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 19:00:03 compute-0 python3.9[108832]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 19:00:04 compute-0 python3.9[108985]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 19:00:05 compute-0 python3.9[109138]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 19:00:05 compute-0 python3.9[109291]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 19:00:06 compute-0 python3.9[109444]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 19:00:08 compute-0 python3.9[109597]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 19:00:08 compute-0 python3.9[109749]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 19:00:09 compute-0 python3.9[109901]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 19:00:10 compute-0 python3.9[110053]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 19:00:11 compute-0 python3.9[110205]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 19:00:11 compute-0 podman[110253]: 2025-10-08 19:00:11.706285812 +0000 UTC m=+0.110991354 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  8 19:00:12 compute-0 python3.9[110374]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 19:00:12 compute-0 python3.9[110526]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 19:00:13 compute-0 python3.9[110678]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  8 19:00:14 compute-0 python3.9[110830]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  8 19:00:15 compute-0 python3.9[110983]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  8 19:00:16 compute-0 python3.9[111135]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  8 19:00:16 compute-0 python3.9[111287]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  8 19:00:17 compute-0 podman[111312]: 2025-10-08 19:00:17.666307067 +0000 UTC m=+0.082398001 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=iscsid)
Oct  8 19:00:18 compute-0 podman[111332]: 2025-10-08 18:00:18.65623814 +0000 UTC m=+0.074176834 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  8 19:00:19 compute-0 podman[111352]: 2025-10-08 19:00:19.706468348 +0000 UTC m=+0.134181661 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Oct  8 19:00:21 compute-0 python3.9[111506]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Oct  8 19:00:22 compute-0 python3.9[111659]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  8 19:00:23 compute-0 python3.9[111817]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct  8 19:00:24 compute-0 systemd-logind[844]: New session 10 of user zuul.
Oct  8 19:00:24 compute-0 systemd[1]: Started Session 10 of User zuul.
Oct  8 19:00:24 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Oct  8 19:00:24 compute-0 systemd-logind[844]: Session 10 logged out. Waiting for processes to exit.
Oct  8 19:00:24 compute-0 systemd-logind[844]: Removed session 10.
Oct  8 19:00:25 compute-0 python3.9[112003]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:00:26 compute-0 python3.9[112124]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759950025.0368984-1535-261634882125252/.source.json follow=False _original_basename=config.json.j2 checksum=2c2474b5f24ef7c9ed37f49680082593e0d1100b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 19:00:27 compute-0 python3.9[112274]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:00:27 compute-0 python3.9[112350]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 19:00:28 compute-0 python3.9[112500]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:00:28 compute-0 python3.9[112621]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759950027.6656327-1535-207035824011674/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 19:00:29 compute-0 python3.9[112771]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:00:30 compute-0 python3.9[112894]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759950028.9868026-1535-15403618273586/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 19:00:30 compute-0 python3.9[113045]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:00:31 compute-0 python3.9[113166]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759950030.289836-1535-192588183002005/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 19:00:32 compute-0 python3.9[113318]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:00:32 compute-0 python3.9[113470]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:00:33 compute-0 python3.9[113622]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 19:00:34 compute-0 python3.9[113774]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:00:35 compute-0 python3.9[113897]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1759950033.9499967-1628-253820750042259/.source _original_basename=.0k0akyhg follow=False checksum=08a0b89788bacf538e34b1b4a6eecea1c595c768 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Oct  8 19:00:35 compute-0 python3.9[114049]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 19:00:36 compute-0 python3.9[114201]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:00:37 compute-0 python3.9[114322]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759950036.2191405-1654-23521037203750/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=837ffd9c004e5987a2e117698c56827ebbfeb5b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 19:00:38 compute-0 python3.9[114472]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:00:38 compute-0 python3.9[114593]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759950037.6038756-1669-204846307884455/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=722ab36345f3375cbdcf911ce8f6e1a8083d7e59 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 19:00:39 compute-0 python3.9[114745]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Oct  8 19:00:40 compute-0 python3.9[114897]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  8 19:00:41 compute-0 python3[115049]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Oct  8 19:00:41 compute-0 podman[115086]: 2025-10-08 19:00:41.720000524 +0000 UTC m=+0.071898660 container create a5a596648e1d0ab8a84f48da9bc406d3b2c789976c4340d1386b8a7ace66d133 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute_init, container_name=nova_compute_init, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm)
Oct  8 19:00:41 compute-0 podman[115086]: 2025-10-08 19:00:41.678222122 +0000 UTC m=+0.030120308 image pull 7ac362f4e836cf46e10a309acb4abf774df9481a1d6404c213437495cfb42f5d quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844
Oct  8 19:00:41 compute-0 python3[115049]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844 bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Oct  8 19:00:42 compute-0 podman[115248]: 2025-10-08 19:00:42.474281179 +0000 UTC m=+0.088208858 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 19:00:42 compute-0 python3.9[115297]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 19:00:43 compute-0 python3.9[115451]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Oct  8 19:00:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:00:44.220 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:00:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:00:44.221 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:00:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:00:44.221 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:00:44 compute-0 python3.9[115603]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  8 19:00:45 compute-0 python3[115755]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Oct  8 19:00:45 compute-0 podman[115793]: 2025-10-08 19:00:45.571118381 +0000 UTC m=+0.062236861 container create e51f777915e1b029705fc377b3a37e2f0df1eae4dc423f4a6608ea43f1cd0baf (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute, config_id=edpm, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, container_name=nova_compute, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Oct  8 19:00:45 compute-0 podman[115793]: 2025-10-08 19:00:45.532721687 +0000 UTC m=+0.023840187 image pull 7ac362f4e836cf46e10a309acb4abf774df9481a1d6404c213437495cfb42f5d quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844
Oct  8 19:00:45 compute-0 python3[115755]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844 kolla_start
Oct  8 19:00:46 compute-0 python3.9[115983]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 19:00:47 compute-0 python3.9[116139]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:00:48 compute-0 podman[116263]: 2025-10-08 19:00:48.108741261 +0000 UTC m=+0.070666154 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  8 19:00:48 compute-0 python3.9[116309]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759950047.6049562-1761-157117261782955/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:00:48 compute-0 python3.9[116385]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  8 19:00:48 compute-0 systemd[1]: Reloading.
Oct  8 19:00:49 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 19:00:49 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 19:00:49 compute-0 podman[116387]: 2025-10-08 19:00:49.025850118 +0000 UTC m=+0.094642803 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  8 19:00:49 compute-0 python3.9[116515]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 19:00:49 compute-0 systemd[1]: Reloading.
Oct  8 19:00:49 compute-0 podman[116517]: 2025-10-08 19:00:49.96216491 +0000 UTC m=+0.104921609 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  8 19:00:50 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 19:00:50 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 19:00:50 compute-0 systemd[1]: Starting nova_compute container...
Oct  8 19:00:50 compute-0 systemd[1]: Started libcrun container.
Oct  8 19:00:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7ea2da9d416b58111d04fa77b29f528729da1bcd33978a81b2a5bf8cb63050d/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Oct  8 19:00:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7ea2da9d416b58111d04fa77b29f528729da1bcd33978a81b2a5bf8cb63050d/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct  8 19:00:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7ea2da9d416b58111d04fa77b29f528729da1bcd33978a81b2a5bf8cb63050d/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct  8 19:00:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7ea2da9d416b58111d04fa77b29f528729da1bcd33978a81b2a5bf8cb63050d/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  8 19:00:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7ea2da9d416b58111d04fa77b29f528729da1bcd33978a81b2a5bf8cb63050d/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct  8 19:00:50 compute-0 podman[116584]: 2025-10-08 19:00:50.408354283 +0000 UTC m=+0.175391806 container init e51f777915e1b029705fc377b3a37e2f0df1eae4dc423f4a6608ea43f1cd0baf (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible)
Oct  8 19:00:50 compute-0 podman[116584]: 2025-10-08 19:00:50.414709856 +0000 UTC m=+0.181747359 container start e51f777915e1b029705fc377b3a37e2f0df1eae4dc423f4a6608ea43f1cd0baf (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible)
Oct  8 19:00:50 compute-0 podman[116584]: nova_compute
Oct  8 19:00:50 compute-0 systemd[1]: Started nova_compute container.
Oct  8 19:00:50 compute-0 nova_compute[116600]: + sudo -E kolla_set_configs
Oct  8 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  8 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Validating config file
Oct  8 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  8 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Copying service configuration files
Oct  8 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Deleting /etc/nova/nova.conf
Oct  8 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Oct  8 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Oct  8 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Oct  8 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Oct  8 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Oct  8 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Oct  8 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Oct  8 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Oct  8 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct  8 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct  8 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Deleting /etc/ceph
Oct  8 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Creating directory /etc/ceph
Oct  8 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Setting permission for /etc/ceph
Oct  8 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Oct  8 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct  8 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Oct  8 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct  8 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Writing out command to execute
Oct  8 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Oct  8 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct  8 19:00:50 compute-0 nova_compute[116600]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct  8 19:00:50 compute-0 nova_compute[116600]: ++ cat /run_command
Oct  8 19:00:50 compute-0 nova_compute[116600]: + CMD=nova-compute
Oct  8 19:00:50 compute-0 nova_compute[116600]: + ARGS=
Oct  8 19:00:50 compute-0 nova_compute[116600]: + sudo kolla_copy_cacerts
Oct  8 19:00:50 compute-0 nova_compute[116600]: + [[ ! -n '' ]]
Oct  8 19:00:50 compute-0 nova_compute[116600]: + . kolla_extend_start
Oct  8 19:00:50 compute-0 nova_compute[116600]: Running command: 'nova-compute'
Oct  8 19:00:50 compute-0 nova_compute[116600]: + echo 'Running command: '\''nova-compute'\'''
Oct  8 19:00:50 compute-0 nova_compute[116600]: + umask 0022
Oct  8 19:00:50 compute-0 nova_compute[116600]: + exec nova-compute
Oct  8 19:00:51 compute-0 python3.9[116761]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 19:00:52 compute-0 python3.9[116912]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 19:00:52 compute-0 python3.9[117063]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 19:00:53 compute-0 nova_compute[116600]: 2025-10-08 19:00:53.369 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  8 19:00:53 compute-0 nova_compute[116600]: 2025-10-08 19:00:53.369 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  8 19:00:53 compute-0 nova_compute[116600]: 2025-10-08 19:00:53.369 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  8 19:00:53 compute-0 nova_compute[116600]: 2025-10-08 19:00:53.369 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Oct  8 19:00:53 compute-0 nova_compute[116600]: 2025-10-08 19:00:53.580 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:00:53 compute-0 nova_compute[116600]: 2025-10-08 19:00:53.617 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:00:53 compute-0 python3.9[117217]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.187 2 INFO nova.virt.driver [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Oct  8 19:00:54 compute-0 rsyslogd[1288]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  8 19:00:54 compute-0 rsyslogd[1288]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  8 19:00:54 compute-0 rsyslogd[1288]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.498 2 INFO nova.compute.provider_config [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.513 2 DEBUG oslo_concurrency.lockutils [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.514 2 DEBUG oslo_concurrency.lockutils [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.514 2 DEBUG oslo_concurrency.lockutils [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.514 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.515 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.515 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.515 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.515 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.515 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.515 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.516 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.516 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.516 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.516 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.516 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.517 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.517 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.517 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.517 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.517 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.518 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.518 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.518 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.518 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.518 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.519 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.519 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.519 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.519 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.519 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.520 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.520 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.520 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.520 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.520 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.521 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.521 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.521 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.521 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.521 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.522 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.522 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.522 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.522 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.522 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.523 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.523 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.523 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.523 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.523 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.524 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.524 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.524 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.524 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.524 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.525 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.525 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.525 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.525 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.525 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.526 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.526 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.526 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.526 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.526 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.527 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.527 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.527 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.527 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.527 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.527 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.528 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.528 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.528 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.528 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.528 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.529 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.529 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.529 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.529 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.529 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.530 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.530 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.530 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.530 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.530 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.531 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.531 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.531 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.531 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.531 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.532 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.532 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.532 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.532 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.532 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.533 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.533 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.533 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.533 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.533 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.533 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.534 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.534 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.534 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.534 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.534 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.535 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.535 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.535 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.535 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.535 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.535 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.536 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.536 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.536 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.536 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.536 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.536 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.537 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.537 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.537 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.537 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.537 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.538 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.538 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.538 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.538 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.538 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.539 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.539 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.539 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.539 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.539 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.539 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.540 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.540 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.540 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.540 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.540 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.541 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.541 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.541 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.541 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.541 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.542 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.542 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.542 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.542 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.542 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.543 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.543 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.543 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.543 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.543 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.544 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.544 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.544 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.544 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.544 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.545 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.545 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.545 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.545 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.545 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.546 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.546 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.546 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.546 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.546 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.547 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.547 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.547 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.547 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.547 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.547 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.548 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.548 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.548 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.548 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.549 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.549 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.549 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.549 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.549 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.550 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.550 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.550 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.550 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.550 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.550 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.551 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.551 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.551 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.551 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.551 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.552 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.552 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.552 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.552 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.552 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.553 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.553 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.553 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.553 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.553 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.554 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.554 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.554 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.554 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.554 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.555 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.555 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.555 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.555 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.555 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.556 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.556 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.556 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.556 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.556 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.557 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.557 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.557 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.557 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.557 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.557 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.558 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.558 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.558 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.558 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.558 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.559 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.559 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.559 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.559 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.559 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.560 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.560 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.560 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.560 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.560 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.561 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.561 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.561 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.561 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.561 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.562 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.562 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.562 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.562 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.562 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.562 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.563 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.563 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.563 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.563 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.563 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.564 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.564 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.564 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.564 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.564 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.565 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.565 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.565 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.565 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.565 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.566 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.566 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.566 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.566 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.566 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.567 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.567 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.567 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.567 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.567 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.568 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.568 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.568 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.568 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.568 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.568 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.569 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.569 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.569 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.569 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.569 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.570 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.570 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.570 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.570 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.570 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.571 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.571 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.571 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.571 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.571 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.572 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.572 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.572 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.572 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.572 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.573 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.573 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.573 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.573 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.573 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.573 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.574 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.574 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.574 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.574 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.574 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.575 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.575 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.575 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.575 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.575 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.576 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.576 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.576 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.576 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.576 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.577 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.577 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.577 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.577 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.577 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.577 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.578 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.578 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.578 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.578 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.578 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.579 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.579 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.579 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.579 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.579 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.580 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.580 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.580 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.580 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.580 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.581 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.581 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.581 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.581 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.581 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.582 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.582 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.582 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.582 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.582 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.582 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.583 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.583 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.583 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.583 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.584 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.584 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.584 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.584 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.584 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.585 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.585 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.585 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.585 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.585 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.586 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.586 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.586 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.586 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.586 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.586 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.587 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.587 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.587 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.587 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.587 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.588 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.588 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.588 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.588 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.588 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.589 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.589 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.589 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.589 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.589 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.590 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.590 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.590 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.590 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.590 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.591 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.591 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.591 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.591 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.591 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.591 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.592 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.592 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.592 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.592 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.592 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.593 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.593 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.593 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.593 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.593 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.594 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.594 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.594 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.594 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.594 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.595 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.595 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.595 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.595 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.595 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.595 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.596 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.596 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.596 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.596 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.596 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.597 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.597 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.597 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.597 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.597 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.598 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.598 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.598 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.598 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.598 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.599 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.599 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.599 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.599 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.599 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.599 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.600 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.600 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.600 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.600 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.600 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.601 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.601 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.601 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.601 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.601 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.602 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.602 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.602 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.602 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.602 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.603 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.603 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.603 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.603 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.603 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.604 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.604 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.604 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.604 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.604 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.605 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.605 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.605 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.605 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.606 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.606 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.606 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.606 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.606 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.607 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.607 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.607 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.607 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.607 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.608 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.608 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.608 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.608 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.608 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.609 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.609 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.609 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.609 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.609 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.610 2 WARNING oslo_config.cfg [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Oct  8 19:00:54 compute-0 nova_compute[116600]: live_migration_uri is deprecated for removal in favor of two other options that
Oct  8 19:00:54 compute-0 nova_compute[116600]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Oct  8 19:00:54 compute-0 nova_compute[116600]: and ``live_migration_inbound_addr`` respectively.
Oct  8 19:00:54 compute-0 nova_compute[116600]: ).  Its value may be silently ignored in the future.#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.610 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.610 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.610 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.611 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.611 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.611 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.611 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.612 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.612 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.612 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.612 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.612 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.612 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.613 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.613 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.613 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.613 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.614 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.614 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.614 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.614 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.614 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.615 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.615 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.615 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.615 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.615 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.616 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.616 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.616 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.616 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.616 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.617 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.617 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.617 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.617 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.617 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.618 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.618 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.618 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.618 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.618 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.619 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.619 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.619 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.619 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.619 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.620 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.620 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.620 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.620 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.620 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.621 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.621 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.621 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.621 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.621 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.622 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.622 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.622 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.622 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.622 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.623 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.623 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.623 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.623 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.623 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.624 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.624 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.624 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.624 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.624 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.625 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.625 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.625 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.625 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.626 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.626 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.626 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.626 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.626 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.627 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.627 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.627 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.627 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.627 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.628 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.628 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.628 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.628 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.628 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.629 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.629 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.629 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.629 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.629 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.630 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.630 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.630 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.630 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.630 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.631 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.631 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.631 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.631 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.631 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.632 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.632 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.632 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.632 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.632 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.633 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.633 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.633 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.633 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.633 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.634 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.634 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.634 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.634 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.634 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.635 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.635 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.635 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.635 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.636 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.636 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.636 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.636 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.636 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.636 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.637 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.637 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.637 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.637 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.638 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.638 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.638 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.638 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.639 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.639 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.639 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.639 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.639 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.640 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.640 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.640 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.640 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.640 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.641 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.641 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.641 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.641 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.641 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.642 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.642 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.642 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.642 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.642 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.643 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.643 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.643 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.643 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.643 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.644 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.644 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.644 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.644 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.644 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.645 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.645 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.645 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.645 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.645 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.646 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.646 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.646 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.646 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.647 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.647 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.647 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.647 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.647 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.648 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.648 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.648 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.648 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.648 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.649 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.649 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.649 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.649 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.649 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.650 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.650 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.650 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.650 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.651 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.651 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.651 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.651 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.651 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.652 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.652 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.652 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.652 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.652 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.653 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.653 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.653 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.653 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.653 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.654 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.654 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.654 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.654 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.654 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.655 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.655 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.655 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.655 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.655 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.656 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.656 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.656 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.656 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.656 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.657 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.657 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.657 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.657 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.657 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.658 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.658 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.658 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.658 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.658 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.659 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.659 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.659 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.659 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.659 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.660 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.660 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.660 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.660 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.660 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.661 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.661 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.661 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.661 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.662 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.662 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.662 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.662 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.662 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.663 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.663 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.663 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.663 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.663 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.664 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.664 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.664 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.664 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.664 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.665 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.665 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.665 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.665 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.665 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.666 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.666 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.666 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.666 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.666 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.667 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.667 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.667 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.667 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.667 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.668 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.668 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.668 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.668 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.668 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.668 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.669 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.669 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.669 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.669 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.670 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.670 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.670 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.670 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.670 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.671 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.671 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.671 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.671 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.671 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.672 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.672 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.672 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.672 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.672 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.673 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.673 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.673 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.673 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.673 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.674 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.674 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.674 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.674 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.674 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.675 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.675 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.675 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.675 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.675 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.676 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.676 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.676 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.676 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.676 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.677 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.677 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.677 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.677 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.677 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.678 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.678 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.678 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.678 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.678 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.679 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.679 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.679 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.679 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.679 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.680 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.680 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.680 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.680 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.680 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.681 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.681 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.681 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.681 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.681 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.681 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.682 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.682 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.682 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.682 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.682 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.683 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.683 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.683 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.683 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.683 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.684 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.684 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.684 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.684 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.684 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.685 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.685 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.685 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.685 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.685 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.686 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.686 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.686 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.686 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.686 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.687 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.687 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.687 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.687 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.687 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.688 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.688 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.688 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.688 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.688 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.688 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.689 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.689 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.689 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.689 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.689 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.690 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.690 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.690 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.690 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.690 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.691 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.691 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.691 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.691 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.691 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.692 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.692 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.692 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.692 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.692 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.693 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.693 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.693 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.693 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.693 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.694 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.694 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.694 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.694 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.694 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.695 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.695 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.695 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.695 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.695 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.695 2 DEBUG oslo_service.service [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Oct  8 19:00:54 compute-0 python3.9[117392]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.701 2 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.714 2 DEBUG nova.virt.libvirt.host [None req-8d283617-3907-4613-a076-7af437b7681f - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.714 2 DEBUG nova.virt.libvirt.host [None req-8d283617-3907-4613-a076-7af437b7681f - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.715 2 DEBUG nova.virt.libvirt.host [None req-8d283617-3907-4613-a076-7af437b7681f - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.715 2 DEBUG nova.virt.libvirt.host [None req-8d283617-3907-4613-a076-7af437b7681f - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Oct  8 19:00:54 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Oct  8 19:00:54 compute-0 systemd[1]: Stopping nova_compute container...
Oct  8 19:00:54 compute-0 systemd[1]: Started libvirt QEMU daemon.
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.779 2 DEBUG nova.virt.libvirt.host [None req-8d283617-3907-4613-a076-7af437b7681f - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fee62472be0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.781 2 DEBUG nova.virt.libvirt.host [None req-8d283617-3907-4613-a076-7af437b7681f - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fee62472be0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.781 2 INFO nova.virt.libvirt.driver [None req-8d283617-3907-4613-a076-7af437b7681f - - - - - -] Connection event '1' reason 'None'#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.807 2 WARNING nova.virt.libvirt.driver [None req-8d283617-3907-4613-a076-7af437b7681f - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.807 2 DEBUG nova.virt.libvirt.volume.mount [None req-8d283617-3907-4613-a076-7af437b7681f - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.811 2 DEBUG oslo_concurrency.lockutils [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.811 2 DEBUG oslo_concurrency.lockutils [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:00:54 compute-0 nova_compute[116600]: 2025-10-08 19:00:54.811 2 DEBUG oslo_concurrency.lockutils [None req-d1827e20-46a6-48aa-b4f1-63dad6a984ca - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:00:56 compute-0 virtqemud[117415]: libvirt version: 10.10.0, package: 15.el9 (builder@centos.org, 2025-08-18-13:22:20, )
Oct  8 19:00:56 compute-0 virtqemud[117415]: hostname: compute-0
Oct  8 19:00:56 compute-0 virtqemud[117415]: End of file while reading data: Input/output error
Oct  8 19:00:56 compute-0 systemd[1]: libpod-e51f777915e1b029705fc377b3a37e2f0df1eae4dc423f4a6608ea43f1cd0baf.scope: Deactivated successfully.
Oct  8 19:00:56 compute-0 systemd[1]: libpod-e51f777915e1b029705fc377b3a37e2f0df1eae4dc423f4a6608ea43f1cd0baf.scope: Consumed 3.306s CPU time.
Oct  8 19:00:56 compute-0 podman[117418]: 2025-10-08 19:00:56.086599994 +0000 UTC m=+1.331394185 container died e51f777915e1b029705fc377b3a37e2f0df1eae4dc423f4a6608ea43f1cd0baf (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm)
Oct  8 19:00:56 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e51f777915e1b029705fc377b3a37e2f0df1eae4dc423f4a6608ea43f1cd0baf-userdata-shm.mount: Deactivated successfully.
Oct  8 19:00:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-d7ea2da9d416b58111d04fa77b29f528729da1bcd33978a81b2a5bf8cb63050d-merged.mount: Deactivated successfully.
Oct  8 19:00:56 compute-0 podman[117418]: 2025-10-08 19:00:56.257189551 +0000 UTC m=+1.501983742 container cleanup e51f777915e1b029705fc377b3a37e2f0df1eae4dc423f4a6608ea43f1cd0baf (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=edpm)
Oct  8 19:00:56 compute-0 podman[117418]: nova_compute
Oct  8 19:00:56 compute-0 podman[117486]: nova_compute
Oct  8 19:00:56 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Oct  8 19:00:56 compute-0 systemd[1]: Stopped nova_compute container.
Oct  8 19:00:56 compute-0 systemd[1]: Starting nova_compute container...
Oct  8 19:00:56 compute-0 systemd[1]: Started libcrun container.
Oct  8 19:00:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7ea2da9d416b58111d04fa77b29f528729da1bcd33978a81b2a5bf8cb63050d/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Oct  8 19:00:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7ea2da9d416b58111d04fa77b29f528729da1bcd33978a81b2a5bf8cb63050d/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct  8 19:00:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7ea2da9d416b58111d04fa77b29f528729da1bcd33978a81b2a5bf8cb63050d/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct  8 19:00:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7ea2da9d416b58111d04fa77b29f528729da1bcd33978a81b2a5bf8cb63050d/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  8 19:00:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7ea2da9d416b58111d04fa77b29f528729da1bcd33978a81b2a5bf8cb63050d/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct  8 19:00:56 compute-0 podman[117499]: 2025-10-08 19:00:56.455018661 +0000 UTC m=+0.100517212 container init e51f777915e1b029705fc377b3a37e2f0df1eae4dc423f4a6608ea43f1cd0baf (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.license=GPLv2)
Oct  8 19:00:56 compute-0 podman[117499]: 2025-10-08 19:00:56.462271209 +0000 UTC m=+0.107769680 container start e51f777915e1b029705fc377b3a37e2f0df1eae4dc423f4a6608ea43f1cd0baf (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute, config_id=edpm, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 19:00:56 compute-0 podman[117499]: nova_compute
Oct  8 19:00:56 compute-0 nova_compute[117514]: + sudo -E kolla_set_configs
Oct  8 19:00:56 compute-0 systemd[1]: Started nova_compute container.
Oct  8 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  8 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Validating config file
Oct  8 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  8 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Copying service configuration files
Oct  8 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Deleting /etc/nova/nova.conf
Oct  8 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Oct  8 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Oct  8 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Oct  8 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Oct  8 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Oct  8 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Oct  8 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Oct  8 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Oct  8 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Oct  8 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Oct  8 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Oct  8 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct  8 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct  8 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct  8 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Deleting /etc/ceph
Oct  8 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Creating directory /etc/ceph
Oct  8 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Setting permission for /etc/ceph
Oct  8 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Oct  8 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Oct  8 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct  8 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Oct  8 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Oct  8 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct  8 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Writing out command to execute
Oct  8 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Oct  8 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct  8 19:00:56 compute-0 nova_compute[117514]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct  8 19:00:56 compute-0 nova_compute[117514]: ++ cat /run_command
Oct  8 19:00:56 compute-0 nova_compute[117514]: + CMD=nova-compute
Oct  8 19:00:56 compute-0 nova_compute[117514]: + ARGS=
Oct  8 19:00:56 compute-0 nova_compute[117514]: + sudo kolla_copy_cacerts
Oct  8 19:00:56 compute-0 nova_compute[117514]: + [[ ! -n '' ]]
Oct  8 19:00:56 compute-0 nova_compute[117514]: + . kolla_extend_start
Oct  8 19:00:56 compute-0 nova_compute[117514]: + echo 'Running command: '\''nova-compute'\'''
Oct  8 19:00:56 compute-0 nova_compute[117514]: Running command: 'nova-compute'
Oct  8 19:00:56 compute-0 nova_compute[117514]: + umask 0022
Oct  8 19:00:56 compute-0 nova_compute[117514]: + exec nova-compute
Oct  8 19:00:57 compute-0 python3.9[117677]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct  8 19:00:57 compute-0 systemd[1]: Started libpod-conmon-a5a596648e1d0ab8a84f48da9bc406d3b2c789976c4340d1386b8a7ace66d133.scope.
Oct  8 19:00:57 compute-0 systemd[1]: Started libcrun container.
Oct  8 19:00:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/299922c2d269b58288dcf595ac237e859afe8756ba652180b1124ae293d6c96c/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Oct  8 19:00:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/299922c2d269b58288dcf595ac237e859afe8756ba652180b1124ae293d6c96c/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct  8 19:00:57 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/299922c2d269b58288dcf595ac237e859afe8756ba652180b1124ae293d6c96c/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Oct  8 19:00:57 compute-0 podman[117703]: 2025-10-08 19:00:57.644434761 +0000 UTC m=+0.119530709 container init a5a596648e1d0ab8a84f48da9bc406d3b2c789976c4340d1386b8a7ace66d133 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute_init, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, org.label-schema.build-date=20251001)
Oct  8 19:00:57 compute-0 podman[117703]: 2025-10-08 19:00:57.652089321 +0000 UTC m=+0.127185269 container start a5a596648e1d0ab8a84f48da9bc406d3b2c789976c4340d1386b8a7ace66d133 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 19:00:57 compute-0 python3.9[117677]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Oct  8 19:00:57 compute-0 nova_compute_init[117725]: INFO:nova_statedir:Applying nova statedir ownership
Oct  8 19:00:57 compute-0 nova_compute_init[117725]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Oct  8 19:00:57 compute-0 nova_compute_init[117725]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Oct  8 19:00:57 compute-0 nova_compute_init[117725]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Oct  8 19:00:57 compute-0 nova_compute_init[117725]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Oct  8 19:00:57 compute-0 nova_compute_init[117725]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Oct  8 19:00:57 compute-0 nova_compute_init[117725]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Oct  8 19:00:57 compute-0 nova_compute_init[117725]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Oct  8 19:00:57 compute-0 nova_compute_init[117725]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Oct  8 19:00:57 compute-0 nova_compute_init[117725]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Oct  8 19:00:57 compute-0 nova_compute_init[117725]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Oct  8 19:00:57 compute-0 nova_compute_init[117725]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Oct  8 19:00:57 compute-0 nova_compute_init[117725]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Oct  8 19:00:57 compute-0 nova_compute_init[117725]: INFO:nova_statedir:Nova statedir ownership complete
Oct  8 19:00:57 compute-0 systemd[1]: libpod-a5a596648e1d0ab8a84f48da9bc406d3b2c789976c4340d1386b8a7ace66d133.scope: Deactivated successfully.
Oct  8 19:00:57 compute-0 podman[117726]: 2025-10-08 19:00:57.725130942 +0000 UTC m=+0.042983507 container died a5a596648e1d0ab8a84f48da9bc406d3b2c789976c4340d1386b8a7ace66d133 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=nova_compute_init, org.label-schema.build-date=20251001)
Oct  8 19:00:57 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a5a596648e1d0ab8a84f48da9bc406d3b2c789976c4340d1386b8a7ace66d133-userdata-shm.mount: Deactivated successfully.
Oct  8 19:00:57 compute-0 systemd[1]: var-lib-containers-storage-overlay-299922c2d269b58288dcf595ac237e859afe8756ba652180b1124ae293d6c96c-merged.mount: Deactivated successfully.
Oct  8 19:00:57 compute-0 podman[117739]: 2025-10-08 19:00:57.807257475 +0000 UTC m=+0.072261170 container cleanup a5a596648e1d0ab8a84f48da9bc406d3b2c789976c4340d1386b8a7ace66d133 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844, name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:5f179b847f2dc32d9110b8f2be9fe65f1aeada1e18105dffdaf052981215d844', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Oct  8 19:00:57 compute-0 systemd[1]: libpod-conmon-a5a596648e1d0ab8a84f48da9bc406d3b2c789976c4340d1386b8a7ace66d133.scope: Deactivated successfully.
Oct  8 19:00:58 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Oct  8 19:00:58 compute-0 systemd[1]: session-8.scope: Consumed 2min 39.539s CPU time.
Oct  8 19:00:58 compute-0 systemd-logind[844]: Session 8 logged out. Waiting for processes to exit.
Oct  8 19:00:58 compute-0 systemd-logind[844]: Removed session 8.
Oct  8 19:00:58 compute-0 nova_compute[117514]: 2025-10-08 19:00:58.553 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  8 19:00:58 compute-0 nova_compute[117514]: 2025-10-08 19:00:58.554 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  8 19:00:58 compute-0 nova_compute[117514]: 2025-10-08 19:00:58.554 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  8 19:00:58 compute-0 nova_compute[117514]: 2025-10-08 19:00:58.554 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Oct  8 19:00:58 compute-0 nova_compute[117514]: 2025-10-08 19:00:58.678 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:00:58 compute-0 nova_compute[117514]: 2025-10-08 19:00:58.707 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.230 2 INFO nova.virt.driver [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.373 2 INFO nova.compute.provider_config [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.397 2 DEBUG oslo_concurrency.lockutils [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.397 2 DEBUG oslo_concurrency.lockutils [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.397 2 DEBUG oslo_concurrency.lockutils [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.398 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.398 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.398 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.398 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.399 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.399 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.399 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.399 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.399 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.399 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.400 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.400 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.400 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.400 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.401 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.401 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.401 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.401 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.402 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.402 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.402 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.402 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.403 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.403 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.403 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.403 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.404 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.404 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.404 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.404 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.405 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.405 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.405 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.405 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.406 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.406 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.406 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.407 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.407 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.407 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.407 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.408 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.408 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.408 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.408 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.408 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.409 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.409 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.409 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.409 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.409 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.410 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.410 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.410 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.410 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.410 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.410 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.411 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.411 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.411 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.411 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.411 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.411 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.412 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.412 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.412 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.412 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.412 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.412 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.413 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.413 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.413 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.413 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.413 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.414 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.414 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.414 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.414 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.414 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.414 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.415 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.415 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.415 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.415 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.416 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.416 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.416 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.416 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.416 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.417 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.417 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.417 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.417 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.417 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.418 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.418 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.418 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.418 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.419 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.419 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.419 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.419 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.419 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.420 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.420 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.420 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.420 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.420 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.421 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.421 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.421 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.421 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.421 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.421 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.422 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.422 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.422 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.422 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.422 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.423 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.423 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.423 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.423 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.423 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.424 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.424 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.424 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.424 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.424 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.425 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.425 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.425 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.425 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.425 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.425 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.426 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.426 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.426 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.426 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.426 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.426 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.427 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.427 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.427 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.427 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.427 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.427 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.428 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.428 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.428 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.428 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.428 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.428 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.429 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.429 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.429 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.429 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.429 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.429 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.430 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.430 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.430 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.430 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.430 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.430 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.431 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.431 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.431 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.431 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.431 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.432 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.432 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.432 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.432 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.432 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.433 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.433 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.433 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.433 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.434 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.434 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.434 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.434 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.434 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.435 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.435 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.435 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.435 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.435 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.436 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.436 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.436 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.436 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.436 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.437 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.437 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.437 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.437 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.437 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.437 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.438 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.438 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.438 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.438 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.438 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.438 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.438 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.439 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.439 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.439 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.439 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.439 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.439 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.439 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.440 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.440 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.440 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.440 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.440 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.441 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.441 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.441 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.441 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.441 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.441 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.441 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.441 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.442 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.442 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.442 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.442 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.442 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.442 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.442 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.443 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.443 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.443 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.443 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.443 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.443 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.443 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.444 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.444 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.444 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.444 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.444 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.444 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.444 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.445 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.445 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.445 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.445 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.445 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.445 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.445 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.445 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.446 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.446 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.446 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.446 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.446 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.446 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.446 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.447 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.447 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.447 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.447 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.447 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.447 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.447 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.448 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.448 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.448 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.448 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.448 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.448 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.448 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.449 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.449 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.449 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.449 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.449 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.449 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.449 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.449 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.450 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.450 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.450 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.450 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.450 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.450 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.450 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.451 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.451 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.451 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.451 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.451 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.451 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.451 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.452 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.452 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.452 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.452 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.452 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.452 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.452 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.453 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.453 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.453 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.453 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.453 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.453 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.453 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.454 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.454 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.454 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.454 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.454 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.454 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.455 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.455 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.455 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.455 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.455 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.456 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.456 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.456 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.456 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.456 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.456 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.456 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.457 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.457 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.457 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.457 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.457 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.457 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.457 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.458 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.458 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.458 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.458 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.458 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.458 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.458 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.459 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.459 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.459 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.459 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.459 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.459 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.459 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.460 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.460 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.460 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.460 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.460 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.460 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.461 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.461 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.461 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.461 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.461 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.461 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.461 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.462 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.462 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.462 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.462 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.462 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.462 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.462 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.463 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.463 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.463 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.463 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.463 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.463 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.463 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.463 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.464 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.464 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.464 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.464 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.464 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.464 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.464 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.465 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.465 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.465 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.465 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.465 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.465 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.465 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.466 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.466 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.466 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.466 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.466 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.466 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.466 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.467 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.467 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.467 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.467 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.467 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.467 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.467 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.467 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.468 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.468 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.468 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.468 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.468 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.468 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.468 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.469 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.469 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.469 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.469 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.469 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.469 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.469 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.469 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.470 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.470 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.470 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.470 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.470 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.470 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.470 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.471 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.471 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.471 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.471 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.471 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.471 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.471 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.471 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.472 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.472 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.472 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.472 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.472 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.472 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.472 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.473 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.473 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.473 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.473 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.473 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.473 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.473 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.473 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.474 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.474 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.474 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.474 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.474 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.474 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.474 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.475 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.475 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.475 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.475 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.475 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.475 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.475 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.476 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.476 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.476 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.476 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.476 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.476 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.476 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.476 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.477 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.477 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.477 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.477 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.477 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.477 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.477 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.477 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.478 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.478 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.478 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.478 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.478 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.478 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.478 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.479 2 WARNING oslo_config.cfg [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Oct  8 19:00:59 compute-0 nova_compute[117514]: live_migration_uri is deprecated for removal in favor of two other options that
Oct  8 19:00:59 compute-0 nova_compute[117514]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Oct  8 19:00:59 compute-0 nova_compute[117514]: and ``live_migration_inbound_addr`` respectively.
Oct  8 19:00:59 compute-0 nova_compute[117514]: ).  Its value may be silently ignored in the future.#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.479 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.479 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.479 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.479 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.479 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.480 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.480 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.480 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.480 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.480 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.480 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.480 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.481 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.481 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.481 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.481 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.481 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.481 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.481 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.482 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.482 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.482 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.482 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.482 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.482 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.482 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.483 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.483 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.483 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.483 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.483 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.483 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.484 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.484 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.484 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.484 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.484 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.484 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.484 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.485 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.485 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.485 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.485 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.485 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.485 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.485 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.486 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.486 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.486 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.486 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.486 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.486 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.486 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.487 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.487 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.487 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.487 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.487 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.487 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.487 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.488 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.488 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.488 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.488 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.488 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.488 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.488 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.489 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.489 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.489 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.489 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.489 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.489 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.489 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.490 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.490 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.490 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.490 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.490 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.490 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.490 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.490 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.491 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.491 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.491 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.491 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.491 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.491 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.492 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.492 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.492 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.492 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.492 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.492 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.492 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.493 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.493 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.493 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.493 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.493 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.494 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.494 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.494 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.494 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.494 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.494 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.495 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.495 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.495 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.495 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.495 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.495 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.495 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.496 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.496 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.496 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.496 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.496 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.496 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.496 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.497 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.497 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.497 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.497 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.497 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.497 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.497 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.498 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.498 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.498 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.498 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.498 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.498 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.498 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.499 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.499 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.499 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.499 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.499 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.499 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.500 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.500 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.500 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.500 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.500 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.500 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.500 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.501 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.501 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.501 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.501 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.501 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.501 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.501 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.501 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.502 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.502 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.502 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.502 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.502 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.502 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.502 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.503 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.503 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.503 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.503 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.503 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.504 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.504 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.504 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.504 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.504 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.504 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.504 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.505 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.505 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.505 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.505 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.505 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.505 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.505 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.506 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.506 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.506 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.506 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.506 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.506 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.506 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.507 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.507 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.507 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.507 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.507 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.507 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.507 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.508 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.508 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.508 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.508 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.508 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.508 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.509 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.509 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.509 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.509 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.509 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.509 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.510 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.510 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.510 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.510 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.510 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.510 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.510 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.511 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.511 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.511 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.511 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.511 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.511 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.511 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.512 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.512 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.512 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.512 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.512 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.512 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.512 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.512 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.513 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.513 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.513 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.513 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.513 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.513 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.513 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.514 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.514 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.514 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.514 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.514 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.514 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.514 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.515 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.515 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.515 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.515 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.515 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.515 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.516 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.516 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.516 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.516 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.516 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.516 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.516 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.517 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.517 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.517 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.517 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.517 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.517 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.517 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.517 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.518 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.518 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.518 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.518 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.518 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.518 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.519 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.519 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.519 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.519 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.519 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.519 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.519 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.520 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.520 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.520 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.520 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.520 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.520 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.520 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.521 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.521 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.521 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.521 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.521 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.521 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.522 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.522 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.522 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.522 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.522 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.522 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.523 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.523 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.523 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.523 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.523 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.523 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.524 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.524 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.524 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.524 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.524 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.524 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.524 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.525 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.525 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.525 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.525 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.525 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.525 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.525 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.526 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.526 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.526 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.526 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.526 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.526 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.526 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.526 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.527 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.527 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.527 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.527 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.527 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.527 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.527 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.528 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.528 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.528 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.528 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.528 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.528 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.528 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.529 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.529 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.529 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.529 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.529 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.529 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.529 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.530 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.530 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.530 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.530 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.530 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.530 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.530 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.531 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.531 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.531 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.531 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.531 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.531 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.531 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.532 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.532 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.532 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.532 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.532 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.532 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.532 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.532 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.533 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.533 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.533 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.533 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.533 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.533 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.533 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.534 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.534 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.534 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.534 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.534 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.534 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.534 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.534 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.535 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.535 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.535 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.535 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.535 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.535 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.535 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.536 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.536 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.536 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.536 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.536 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.536 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.536 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.537 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.537 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.537 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.537 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.537 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.537 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.537 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.538 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.538 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.538 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.538 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.538 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.538 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.538 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.539 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.539 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.539 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.539 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.539 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.539 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.539 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.539 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.540 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.540 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.540 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.540 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.540 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.540 2 DEBUG oslo_service.service [None req-fffc0769-5d70-4e7d-b57f-d001c6037a93 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.541 2 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.554 2 DEBUG nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.555 2 DEBUG nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.555 2 DEBUG nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.555 2 DEBUG nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.568 2 DEBUG nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fe40d5c63d0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.570 2 DEBUG nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fe40d5c63d0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.571 2 INFO nova.virt.libvirt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Connection event '1' reason 'None'#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.579 2 INFO nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Libvirt host capabilities <capabilities>
Oct  8 19:00:59 compute-0 nova_compute[117514]: 
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <host>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <uuid>9ff32318-d7e0-4b37-bb6e-ea4cfd795672</uuid>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <cpu>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <arch>x86_64</arch>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model>EPYC-Rome-v4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <vendor>AMD</vendor>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <microcode version='16777317'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <signature family='23' model='49' stepping='0'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <maxphysaddr mode='emulate' bits='40'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature name='x2apic'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature name='tsc-deadline'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature name='osxsave'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature name='hypervisor'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature name='tsc_adjust'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature name='spec-ctrl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature name='stibp'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature name='arch-capabilities'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature name='ssbd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature name='cmp_legacy'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature name='topoext'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature name='virt-ssbd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature name='lbrv'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature name='tsc-scale'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature name='vmcb-clean'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature name='pause-filter'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature name='pfthreshold'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature name='svme-addr-chk'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature name='rdctl-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature name='skip-l1dfl-vmentry'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature name='mds-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature name='pschange-mc-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <pages unit='KiB' size='4'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <pages unit='KiB' size='2048'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <pages unit='KiB' size='1048576'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </cpu>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <power_management>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <suspend_mem/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <suspend_disk/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <suspend_hybrid/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </power_management>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <iommu support='no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <migration_features>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <live/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <uri_transports>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <uri_transport>tcp</uri_transport>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <uri_transport>rdma</uri_transport>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </uri_transports>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </migration_features>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <topology>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <cells num='1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <cell id='0'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:          <memory unit='KiB'>7864104</memory>
Oct  8 19:00:59 compute-0 nova_compute[117514]:          <pages unit='KiB' size='4'>1966026</pages>
Oct  8 19:00:59 compute-0 nova_compute[117514]:          <pages unit='KiB' size='2048'>0</pages>
Oct  8 19:00:59 compute-0 nova_compute[117514]:          <pages unit='KiB' size='1048576'>0</pages>
Oct  8 19:00:59 compute-0 nova_compute[117514]:          <distances>
Oct  8 19:00:59 compute-0 nova_compute[117514]:            <sibling id='0' value='10'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:          </distances>
Oct  8 19:00:59 compute-0 nova_compute[117514]:          <cpus num='8'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:          </cpus>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        </cell>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </cells>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </topology>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <cache>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </cache>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <secmodel>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model>selinux</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <doi>0</doi>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </secmodel>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <secmodel>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model>dac</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <doi>0</doi>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <baselabel type='kvm'>+107:+107</baselabel>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <baselabel type='qemu'>+107:+107</baselabel>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </secmodel>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  </host>
Oct  8 19:00:59 compute-0 nova_compute[117514]: 
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <guest>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <os_type>hvm</os_type>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <arch name='i686'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <wordsize>32</wordsize>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <domain type='qemu'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <domain type='kvm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </arch>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <features>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <pae/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <nonpae/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <acpi default='on' toggle='yes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <apic default='on' toggle='no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <cpuselection/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <deviceboot/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <disksnapshot default='on' toggle='no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <externalSnapshot/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </features>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  </guest>
Oct  8 19:00:59 compute-0 nova_compute[117514]: 
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <guest>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <os_type>hvm</os_type>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <arch name='x86_64'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <wordsize>64</wordsize>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <domain type='qemu'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <domain type='kvm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </arch>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <features>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <acpi default='on' toggle='yes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <apic default='on' toggle='no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <cpuselection/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <deviceboot/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <disksnapshot default='on' toggle='no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <externalSnapshot/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </features>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  </guest>
Oct  8 19:00:59 compute-0 nova_compute[117514]: 
Oct  8 19:00:59 compute-0 nova_compute[117514]: </capabilities>
Oct  8 19:00:59 compute-0 nova_compute[117514]: #033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.584 2 WARNING nova.virt.libvirt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.584 2 DEBUG nova.virt.libvirt.volume.mount [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.586 2 DEBUG nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.610 2 DEBUG nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Oct  8 19:00:59 compute-0 nova_compute[117514]: <domainCapabilities>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <path>/usr/libexec/qemu-kvm</path>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <domain>kvm</domain>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <machine>pc-q35-rhel9.6.0</machine>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <arch>i686</arch>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <vcpu max='4096'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <iothreads supported='yes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <os supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <enum name='firmware'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <loader supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='type'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>rom</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>pflash</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='readonly'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>yes</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>no</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='secure'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>no</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </loader>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  </os>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <cpu>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <mode name='host-passthrough' supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='hostPassthroughMigratable'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>on</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>off</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </mode>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <mode name='maximum' supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='maximumMigratable'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>on</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>off</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </mode>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <mode name='host-model' supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <vendor>AMD</vendor>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='x2apic'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='tsc-deadline'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='hypervisor'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='tsc_adjust'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='spec-ctrl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='stibp'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='arch-capabilities'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='ssbd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='cmp_legacy'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='overflow-recov'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='succor'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='ibrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='amd-ssbd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='virt-ssbd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='lbrv'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='tsc-scale'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='vmcb-clean'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='flushbyasid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='pause-filter'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='pfthreshold'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='svme-addr-chk'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='rdctl-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='mds-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='pschange-mc-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='gds-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='rfds-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='disable' name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </mode>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <mode name='custom' supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Broadwell'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Broadwell-IBRS'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Broadwell-noTSX'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Broadwell-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Broadwell-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Broadwell-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Broadwell-v4'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cascadelake-Server'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cascadelake-Server-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cascadelake-Server-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cascadelake-Server-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cascadelake-Server-v4'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cascadelake-Server-v5'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cooperlake'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cooperlake-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cooperlake-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Denverton'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='mpx'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Denverton-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='mpx'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Denverton-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Denverton-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Dhyana-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-Genoa'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amd-psfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='auto-ibrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='no-nested-data-bp'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='null-sel-clr-base'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='stibp-always-on'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-Genoa-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amd-psfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='auto-ibrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='no-nested-data-bp'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='null-sel-clr-base'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='stibp-always-on'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-Milan'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-Milan-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-Milan-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amd-psfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='no-nested-data-bp'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='null-sel-clr-base'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='stibp-always-on'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-Rome'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-Rome-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-Rome-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-Rome-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-v4'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='GraniteRapids'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-int8'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-tile'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='bus-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fbsdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrc'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fzrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='mcdt-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pbrsb-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='prefetchiti'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='psdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='sbdr-ssdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='serialize'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='tsx-ldtrk'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='GraniteRapids-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-int8'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-tile'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='bus-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fbsdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrc'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fzrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='mcdt-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pbrsb-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='prefetchiti'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='psdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='sbdr-ssdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='serialize'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='tsx-ldtrk'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='GraniteRapids-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-int8'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-tile'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx10'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx10-128'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx10-256'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx10-512'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='bus-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='cldemote'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fbsdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrc'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fzrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='mcdt-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdir64b'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdiri'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pbrsb-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='prefetchiti'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='psdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='sbdr-ssdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='serialize'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='tsx-ldtrk'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Haswell'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Haswell-IBRS'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Haswell-noTSX'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Haswell-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Haswell-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Haswell-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Haswell-v4'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Icelake-Server'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Icelake-Server-noTSX'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Icelake-Server-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Icelake-Server-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Icelake-Server-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Icelake-Server-v4'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Icelake-Server-v5'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Icelake-Server-v6'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Icelake-Server-v7'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='IvyBridge'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='IvyBridge-IBRS'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='IvyBridge-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='IvyBridge-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='KnightsMill'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-4fmaps'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-4vnniw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512er'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512pf'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='KnightsMill-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-4fmaps'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-4vnniw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512er'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512pf'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Opteron_G4'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fma4'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xop'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Opteron_G4-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fma4'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xop'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Opteron_G5'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fma4'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='tbm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xop'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Opteron_G5-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fma4'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='tbm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xop'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='SapphireRapids'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-int8'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-tile'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='bus-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrc'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fzrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='serialize'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='tsx-ldtrk'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='SapphireRapids-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-int8'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-tile'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='bus-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrc'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fzrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='serialize'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='tsx-ldtrk'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='SapphireRapids-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-int8'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-tile'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='bus-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fbsdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrc'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fzrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='psdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='sbdr-ssdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='serialize'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='tsx-ldtrk'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='SapphireRapids-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-int8'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-tile'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='bus-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='cldemote'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fbsdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrc'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fzrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdir64b'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdiri'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='psdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='sbdr-ssdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='serialize'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='tsx-ldtrk'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='SierraForest'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-ne-convert'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni-int8'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='bus-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='cmpccxadd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fbsdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='mcdt-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pbrsb-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='psdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='sbdr-ssdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='serialize'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='SierraForest-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-ne-convert'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni-int8'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='bus-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='cmpccxadd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fbsdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='mcdt-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pbrsb-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='psdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='sbdr-ssdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='serialize'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Client'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Client-IBRS'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Client-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Client-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Client-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Client-v4'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Server'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Server-IBRS'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Server-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Server-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Server-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Server-v4'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Server-v5'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Snowridge'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='cldemote'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='core-capability'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdir64b'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdiri'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='mpx'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='split-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Snowridge-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='cldemote'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='core-capability'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdir64b'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdiri'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='mpx'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='split-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Snowridge-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='cldemote'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='core-capability'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdir64b'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdiri'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='split-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Snowridge-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='cldemote'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='core-capability'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdir64b'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdiri'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='split-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Snowridge-v4'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='cldemote'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdir64b'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdiri'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='athlon'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='3dnow'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='3dnowext'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='athlon-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='3dnow'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='3dnowext'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='core2duo'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='core2duo-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='coreduo'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='coreduo-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='n270'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='n270-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='phenom'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='3dnow'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='3dnowext'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='phenom-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='3dnow'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='3dnowext'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </mode>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  </cpu>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <memoryBacking supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <enum name='sourceType'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <value>file</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <value>anonymous</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <value>memfd</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  </memoryBacking>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <devices>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <disk supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='diskDevice'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>disk</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>cdrom</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>floppy</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>lun</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='bus'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>fdc</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>scsi</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>virtio</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>usb</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>sata</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='model'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>virtio</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>virtio-transitional</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>virtio-non-transitional</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <graphics supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='type'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>vnc</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>egl-headless</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>dbus</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </graphics>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <video supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='modelType'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>vga</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>cirrus</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>virtio</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>none</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>bochs</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>ramfb</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </video>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <hostdev supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='mode'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>subsystem</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='startupPolicy'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>default</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>mandatory</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>requisite</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>optional</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='subsysType'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>usb</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>pci</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>scsi</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='capsType'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='pciBackend'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </hostdev>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <rng supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='model'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>virtio</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>virtio-transitional</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>virtio-non-transitional</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='backendModel'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>random</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>egd</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>builtin</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </rng>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <filesystem supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='driverType'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>path</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>handle</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>virtiofs</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </filesystem>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <tpm supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='model'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>tpm-tis</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>tpm-crb</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='backendModel'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>emulator</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>external</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='backendVersion'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>2.0</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </tpm>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <redirdev supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='bus'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>usb</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </redirdev>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <channel supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='type'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>pty</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>unix</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </channel>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <crypto supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='model'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='type'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>qemu</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='backendModel'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>builtin</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </crypto>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <interface supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='backendType'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>default</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>passt</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </interface>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <panic supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='model'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>isa</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>hyperv</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </panic>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  </devices>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <features>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <gic supported='no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <vmcoreinfo supported='yes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <genid supported='yes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <backingStoreInput supported='yes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <backup supported='yes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <async-teardown supported='yes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <ps2 supported='yes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <sev supported='no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <sgx supported='no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <hyperv supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='features'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>relaxed</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>vapic</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>spinlocks</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>vpindex</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>runtime</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>synic</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>stimer</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>reset</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>vendor_id</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>frequencies</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>reenlightenment</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>tlbflush</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>ipi</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>avic</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>emsr_bitmap</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>xmm_input</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </hyperv>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <launchSecurity supported='no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  </features>
Oct  8 19:00:59 compute-0 nova_compute[117514]: </domainCapabilities>
Oct  8 19:00:59 compute-0 nova_compute[117514]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.616 2 DEBUG nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Oct  8 19:00:59 compute-0 nova_compute[117514]: <domainCapabilities>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <path>/usr/libexec/qemu-kvm</path>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <domain>kvm</domain>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <machine>pc-i440fx-rhel7.6.0</machine>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <arch>i686</arch>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <vcpu max='240'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <iothreads supported='yes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <os supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <enum name='firmware'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <loader supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='type'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>rom</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>pflash</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='readonly'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>yes</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>no</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='secure'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>no</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </loader>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  </os>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <cpu>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <mode name='host-passthrough' supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='hostPassthroughMigratable'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>on</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>off</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </mode>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <mode name='maximum' supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='maximumMigratable'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>on</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>off</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </mode>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <mode name='host-model' supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <vendor>AMD</vendor>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='x2apic'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='tsc-deadline'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='hypervisor'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='tsc_adjust'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='spec-ctrl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='stibp'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='arch-capabilities'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='ssbd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='cmp_legacy'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='overflow-recov'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='succor'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='ibrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='amd-ssbd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='virt-ssbd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='lbrv'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='tsc-scale'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='vmcb-clean'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='flushbyasid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='pause-filter'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='pfthreshold'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='svme-addr-chk'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='rdctl-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='mds-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='pschange-mc-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='gds-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='rfds-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='disable' name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </mode>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <mode name='custom' supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Broadwell'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Broadwell-IBRS'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Broadwell-noTSX'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Broadwell-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Broadwell-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Broadwell-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Broadwell-v4'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cascadelake-Server'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cascadelake-Server-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cascadelake-Server-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cascadelake-Server-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cascadelake-Server-v4'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cascadelake-Server-v5'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cooperlake'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cooperlake-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cooperlake-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Denverton'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='mpx'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Denverton-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='mpx'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Denverton-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Denverton-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Dhyana-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-Genoa'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amd-psfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='auto-ibrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='no-nested-data-bp'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='null-sel-clr-base'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='stibp-always-on'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-Genoa-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amd-psfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='auto-ibrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='no-nested-data-bp'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='null-sel-clr-base'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='stibp-always-on'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-Milan'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-Milan-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-Milan-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amd-psfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='no-nested-data-bp'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='null-sel-clr-base'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='stibp-always-on'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-Rome'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-Rome-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-Rome-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-Rome-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-v4'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='GraniteRapids'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-int8'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-tile'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='bus-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fbsdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrc'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fzrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='mcdt-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pbrsb-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='prefetchiti'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='psdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='sbdr-ssdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='serialize'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='tsx-ldtrk'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='GraniteRapids-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-int8'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-tile'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='bus-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fbsdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrc'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fzrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='mcdt-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pbrsb-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='prefetchiti'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='psdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='sbdr-ssdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='serialize'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='tsx-ldtrk'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='GraniteRapids-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-int8'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-tile'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx10'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx10-128'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx10-256'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx10-512'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='bus-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='cldemote'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fbsdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrc'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fzrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='mcdt-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdir64b'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdiri'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pbrsb-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='prefetchiti'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='psdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='sbdr-ssdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='serialize'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='tsx-ldtrk'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Haswell'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Haswell-IBRS'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Haswell-noTSX'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Haswell-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Haswell-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Haswell-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Haswell-v4'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Icelake-Server'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Icelake-Server-noTSX'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Icelake-Server-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Icelake-Server-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Icelake-Server-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Icelake-Server-v4'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Icelake-Server-v5'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Icelake-Server-v6'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Icelake-Server-v7'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='IvyBridge'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='IvyBridge-IBRS'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='IvyBridge-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='IvyBridge-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='KnightsMill'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-4fmaps'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-4vnniw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512er'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512pf'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='KnightsMill-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-4fmaps'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-4vnniw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512er'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512pf'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Opteron_G4'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fma4'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xop'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Opteron_G4-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fma4'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xop'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Opteron_G5'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fma4'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='tbm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xop'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Opteron_G5-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fma4'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='tbm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xop'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='SapphireRapids'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-int8'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-tile'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='bus-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrc'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fzrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='serialize'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='tsx-ldtrk'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='SapphireRapids-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-int8'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-tile'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='bus-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrc'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fzrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='serialize'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='tsx-ldtrk'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='SapphireRapids-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-int8'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-tile'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='bus-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fbsdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrc'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fzrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='psdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='sbdr-ssdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='serialize'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='tsx-ldtrk'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='SapphireRapids-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-int8'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-tile'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='bus-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='cldemote'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fbsdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrc'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fzrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdir64b'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdiri'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='psdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='sbdr-ssdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='serialize'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='tsx-ldtrk'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='SierraForest'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-ne-convert'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni-int8'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='bus-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='cmpccxadd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fbsdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='mcdt-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pbrsb-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='psdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='sbdr-ssdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='serialize'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='SierraForest-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-ne-convert'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni-int8'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='bus-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='cmpccxadd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fbsdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='mcdt-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pbrsb-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='psdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='sbdr-ssdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='serialize'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Client'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Client-IBRS'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Client-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Client-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Client-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Client-v4'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Server'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Server-IBRS'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Server-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Server-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Server-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Server-v4'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Server-v5'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Snowridge'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='cldemote'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='core-capability'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdir64b'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdiri'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='mpx'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='split-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Snowridge-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='cldemote'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='core-capability'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdir64b'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdiri'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='mpx'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='split-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Snowridge-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='cldemote'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='core-capability'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdir64b'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdiri'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='split-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Snowridge-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='cldemote'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='core-capability'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdir64b'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdiri'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='split-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Snowridge-v4'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='cldemote'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdir64b'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdiri'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='athlon'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='3dnow'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='3dnowext'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='athlon-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='3dnow'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='3dnowext'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='core2duo'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='core2duo-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='coreduo'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='coreduo-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='n270'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='n270-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='phenom'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='3dnow'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='3dnowext'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='phenom-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='3dnow'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='3dnowext'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </mode>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  </cpu>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <memoryBacking supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <enum name='sourceType'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <value>file</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <value>anonymous</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <value>memfd</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  </memoryBacking>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <devices>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <disk supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='diskDevice'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>disk</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>cdrom</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>floppy</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>lun</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='bus'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>ide</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>fdc</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>scsi</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>virtio</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>usb</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>sata</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='model'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>virtio</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>virtio-transitional</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>virtio-non-transitional</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <graphics supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='type'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>vnc</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>egl-headless</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>dbus</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </graphics>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <video supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='modelType'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>vga</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>cirrus</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>virtio</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>none</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>bochs</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>ramfb</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </video>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <hostdev supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='mode'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>subsystem</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='startupPolicy'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>default</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>mandatory</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>requisite</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>optional</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='subsysType'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>usb</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>pci</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>scsi</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='capsType'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='pciBackend'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </hostdev>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <rng supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='model'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>virtio</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>virtio-transitional</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>virtio-non-transitional</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='backendModel'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>random</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>egd</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>builtin</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </rng>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <filesystem supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='driverType'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>path</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>handle</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>virtiofs</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </filesystem>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <tpm supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='model'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>tpm-tis</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>tpm-crb</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='backendModel'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>emulator</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>external</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='backendVersion'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>2.0</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </tpm>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <redirdev supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='bus'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>usb</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </redirdev>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <channel supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='type'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>pty</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>unix</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </channel>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <crypto supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='model'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='type'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>qemu</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='backendModel'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>builtin</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </crypto>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <interface supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='backendType'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>default</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>passt</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </interface>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <panic supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='model'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>isa</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>hyperv</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </panic>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  </devices>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <features>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <gic supported='no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <vmcoreinfo supported='yes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <genid supported='yes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <backingStoreInput supported='yes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <backup supported='yes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <async-teardown supported='yes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <ps2 supported='yes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <sev supported='no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <sgx supported='no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <hyperv supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='features'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>relaxed</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>vapic</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>spinlocks</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>vpindex</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>runtime</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>synic</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>stimer</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>reset</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>vendor_id</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>frequencies</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>reenlightenment</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>tlbflush</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>ipi</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>avic</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>emsr_bitmap</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>xmm_input</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </hyperv>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <launchSecurity supported='no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  </features>
Oct  8 19:00:59 compute-0 nova_compute[117514]: </domainCapabilities>
Oct  8 19:00:59 compute-0 nova_compute[117514]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.660 2 DEBUG nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.664 2 DEBUG nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Oct  8 19:00:59 compute-0 nova_compute[117514]: <domainCapabilities>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <path>/usr/libexec/qemu-kvm</path>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <domain>kvm</domain>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <machine>pc-i440fx-rhel7.6.0</machine>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <arch>x86_64</arch>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <vcpu max='240'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <iothreads supported='yes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <os supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <enum name='firmware'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <loader supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='type'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>rom</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>pflash</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='readonly'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>yes</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>no</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='secure'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>no</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </loader>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  </os>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <cpu>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <mode name='host-passthrough' supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='hostPassthroughMigratable'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>on</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>off</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </mode>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <mode name='maximum' supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='maximumMigratable'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>on</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>off</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </mode>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <mode name='host-model' supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <vendor>AMD</vendor>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='x2apic'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='tsc-deadline'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='hypervisor'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='tsc_adjust'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='spec-ctrl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='stibp'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='arch-capabilities'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='ssbd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='cmp_legacy'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='overflow-recov'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='succor'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='ibrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='amd-ssbd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='virt-ssbd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='lbrv'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='tsc-scale'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='vmcb-clean'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='flushbyasid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='pause-filter'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='pfthreshold'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='svme-addr-chk'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='rdctl-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='mds-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='pschange-mc-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='gds-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='rfds-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='disable' name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </mode>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <mode name='custom' supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Broadwell'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Broadwell-IBRS'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Broadwell-noTSX'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Broadwell-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Broadwell-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Broadwell-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Broadwell-v4'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cascadelake-Server'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cascadelake-Server-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cascadelake-Server-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cascadelake-Server-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cascadelake-Server-v4'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cascadelake-Server-v5'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cooperlake'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cooperlake-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cooperlake-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Denverton'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='mpx'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Denverton-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='mpx'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Denverton-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Denverton-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Dhyana-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-Genoa'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amd-psfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='auto-ibrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='no-nested-data-bp'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='null-sel-clr-base'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='stibp-always-on'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-Genoa-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amd-psfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='auto-ibrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='no-nested-data-bp'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='null-sel-clr-base'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='stibp-always-on'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-Milan'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-Milan-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-Milan-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amd-psfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='no-nested-data-bp'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='null-sel-clr-base'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='stibp-always-on'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-Rome'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-Rome-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-Rome-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-Rome-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-v4'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='GraniteRapids'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-int8'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-tile'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='bus-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fbsdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrc'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fzrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='mcdt-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pbrsb-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='prefetchiti'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='psdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='sbdr-ssdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='serialize'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='tsx-ldtrk'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='GraniteRapids-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-int8'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-tile'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='bus-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fbsdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrc'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fzrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='mcdt-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pbrsb-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='prefetchiti'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='psdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='sbdr-ssdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='serialize'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='tsx-ldtrk'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='GraniteRapids-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-int8'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-tile'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx10'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx10-128'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx10-256'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx10-512'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='bus-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='cldemote'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fbsdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrc'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fzrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='mcdt-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdir64b'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdiri'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pbrsb-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='prefetchiti'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='psdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='sbdr-ssdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='serialize'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='tsx-ldtrk'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Haswell'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Haswell-IBRS'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Haswell-noTSX'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Haswell-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Haswell-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Haswell-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Haswell-v4'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Icelake-Server'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Icelake-Server-noTSX'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Icelake-Server-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Icelake-Server-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Icelake-Server-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Icelake-Server-v4'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Icelake-Server-v5'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Icelake-Server-v6'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Icelake-Server-v7'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='IvyBridge'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='IvyBridge-IBRS'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='IvyBridge-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='IvyBridge-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='KnightsMill'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-4fmaps'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-4vnniw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512er'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512pf'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='KnightsMill-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-4fmaps'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-4vnniw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512er'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512pf'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Opteron_G4'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fma4'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xop'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Opteron_G4-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fma4'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xop'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Opteron_G5'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fma4'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='tbm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xop'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Opteron_G5-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fma4'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='tbm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xop'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='SapphireRapids'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-int8'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-tile'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='bus-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrc'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fzrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='serialize'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='tsx-ldtrk'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='SapphireRapids-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-int8'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-tile'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='bus-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrc'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fzrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='serialize'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='tsx-ldtrk'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='SapphireRapids-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-int8'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-tile'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='bus-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fbsdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrc'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fzrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='psdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='sbdr-ssdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='serialize'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='tsx-ldtrk'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='SapphireRapids-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-int8'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-tile'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='bus-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='cldemote'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fbsdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrc'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fzrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdir64b'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdiri'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='psdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='sbdr-ssdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='serialize'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='tsx-ldtrk'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='SierraForest'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-ne-convert'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni-int8'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='bus-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='cmpccxadd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fbsdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='mcdt-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pbrsb-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='psdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='sbdr-ssdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='serialize'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='SierraForest-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-ne-convert'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni-int8'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='bus-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='cmpccxadd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fbsdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='mcdt-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pbrsb-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='psdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='sbdr-ssdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='serialize'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Client'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Client-IBRS'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Client-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Client-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Client-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Client-v4'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Server'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Server-IBRS'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Server-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Server-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Server-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Server-v4'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Server-v5'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Snowridge'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='cldemote'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='core-capability'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdir64b'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdiri'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='mpx'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='split-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Snowridge-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='cldemote'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='core-capability'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdir64b'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdiri'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='mpx'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='split-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Snowridge-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='cldemote'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='core-capability'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdir64b'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdiri'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='split-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Snowridge-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='cldemote'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='core-capability'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdir64b'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdiri'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='split-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Snowridge-v4'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='cldemote'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdir64b'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdiri'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='athlon'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='3dnow'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='3dnowext'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='athlon-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='3dnow'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='3dnowext'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='core2duo'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='core2duo-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='coreduo'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='coreduo-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='n270'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='n270-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='phenom'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='3dnow'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='3dnowext'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='phenom-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='3dnow'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='3dnowext'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </mode>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  </cpu>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <memoryBacking supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <enum name='sourceType'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <value>file</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <value>anonymous</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <value>memfd</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  </memoryBacking>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <devices>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <disk supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='diskDevice'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>disk</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>cdrom</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>floppy</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>lun</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='bus'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>ide</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>fdc</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>scsi</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>virtio</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>usb</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>sata</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='model'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>virtio</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>virtio-transitional</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>virtio-non-transitional</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <graphics supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='type'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>vnc</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>egl-headless</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>dbus</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </graphics>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <video supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='modelType'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>vga</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>cirrus</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>virtio</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>none</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>bochs</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>ramfb</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </video>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <hostdev supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='mode'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>subsystem</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='startupPolicy'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>default</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>mandatory</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>requisite</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>optional</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='subsysType'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>usb</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>pci</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>scsi</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='capsType'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='pciBackend'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </hostdev>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <rng supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='model'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>virtio</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>virtio-transitional</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>virtio-non-transitional</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='backendModel'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>random</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>egd</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>builtin</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </rng>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <filesystem supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='driverType'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>path</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>handle</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>virtiofs</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </filesystem>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <tpm supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='model'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>tpm-tis</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>tpm-crb</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='backendModel'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>emulator</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>external</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='backendVersion'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>2.0</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </tpm>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <redirdev supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='bus'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>usb</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </redirdev>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <channel supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='type'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>pty</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>unix</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </channel>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <crypto supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='model'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='type'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>qemu</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='backendModel'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>builtin</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </crypto>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <interface supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='backendType'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>default</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>passt</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </interface>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <panic supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='model'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>isa</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>hyperv</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </panic>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  </devices>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <features>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <gic supported='no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <vmcoreinfo supported='yes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <genid supported='yes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <backingStoreInput supported='yes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <backup supported='yes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <async-teardown supported='yes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <ps2 supported='yes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <sev supported='no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <sgx supported='no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <hyperv supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='features'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>relaxed</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>vapic</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>spinlocks</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>vpindex</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>runtime</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>synic</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>stimer</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>reset</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>vendor_id</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>frequencies</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>reenlightenment</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>tlbflush</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>ipi</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>avic</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>emsr_bitmap</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>xmm_input</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </hyperv>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <launchSecurity supported='no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  </features>
Oct  8 19:00:59 compute-0 nova_compute[117514]: </domainCapabilities>
Oct  8 19:00:59 compute-0 nova_compute[117514]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.729 2 DEBUG nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Oct  8 19:00:59 compute-0 nova_compute[117514]: <domainCapabilities>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <path>/usr/libexec/qemu-kvm</path>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <domain>kvm</domain>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <machine>pc-q35-rhel9.6.0</machine>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <arch>x86_64</arch>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <vcpu max='4096'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <iothreads supported='yes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <os supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <enum name='firmware'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <value>efi</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <loader supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='type'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>rom</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>pflash</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='readonly'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>yes</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>no</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='secure'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>yes</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>no</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </loader>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  </os>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <cpu>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <mode name='host-passthrough' supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='hostPassthroughMigratable'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>on</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>off</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </mode>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <mode name='maximum' supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='maximumMigratable'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>on</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>off</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </mode>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <mode name='host-model' supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <vendor>AMD</vendor>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='x2apic'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='tsc-deadline'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='hypervisor'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='tsc_adjust'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='spec-ctrl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='stibp'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='arch-capabilities'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='ssbd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='cmp_legacy'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='overflow-recov'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='succor'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='ibrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='amd-ssbd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='virt-ssbd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='lbrv'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='tsc-scale'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='vmcb-clean'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='flushbyasid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='pause-filter'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='pfthreshold'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='svme-addr-chk'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='rdctl-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='mds-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='pschange-mc-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='gds-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='require' name='rfds-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <feature policy='disable' name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </mode>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <mode name='custom' supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Broadwell'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Broadwell-IBRS'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Broadwell-noTSX'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Broadwell-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Broadwell-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Broadwell-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Broadwell-v4'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cascadelake-Server'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cascadelake-Server-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cascadelake-Server-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cascadelake-Server-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cascadelake-Server-v4'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cascadelake-Server-v5'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cooperlake'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cooperlake-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Cooperlake-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Denverton'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='mpx'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Denverton-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='mpx'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Denverton-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Denverton-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Dhyana-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-Genoa'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amd-psfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='auto-ibrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='no-nested-data-bp'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='null-sel-clr-base'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='stibp-always-on'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-Genoa-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amd-psfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='auto-ibrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='no-nested-data-bp'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='null-sel-clr-base'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='stibp-always-on'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-Milan'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-Milan-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-Milan-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amd-psfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='no-nested-data-bp'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='null-sel-clr-base'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='stibp-always-on'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-Rome'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-Rome-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-Rome-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-Rome-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='EPYC-v4'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='GraniteRapids'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-int8'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-tile'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='bus-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fbsdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrc'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fzrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='mcdt-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pbrsb-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='prefetchiti'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='psdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='sbdr-ssdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='serialize'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='tsx-ldtrk'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='GraniteRapids-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-int8'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-tile'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='bus-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fbsdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrc'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fzrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='mcdt-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pbrsb-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='prefetchiti'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='psdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='sbdr-ssdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='serialize'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='tsx-ldtrk'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='GraniteRapids-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-int8'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-tile'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx10'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx10-128'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx10-256'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx10-512'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='bus-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='cldemote'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fbsdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrc'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fzrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='mcdt-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdir64b'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdiri'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pbrsb-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='prefetchiti'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='psdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='sbdr-ssdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='serialize'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='tsx-ldtrk'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Haswell'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Haswell-IBRS'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Haswell-noTSX'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Haswell-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Haswell-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Haswell-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Haswell-v4'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Icelake-Server'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Icelake-Server-noTSX'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Icelake-Server-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Icelake-Server-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Icelake-Server-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Icelake-Server-v4'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Icelake-Server-v5'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Icelake-Server-v6'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Icelake-Server-v7'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='IvyBridge'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='IvyBridge-IBRS'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='IvyBridge-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='IvyBridge-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='KnightsMill'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-4fmaps'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-4vnniw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512er'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512pf'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='KnightsMill-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-4fmaps'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-4vnniw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512er'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512pf'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Opteron_G4'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fma4'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xop'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Opteron_G4-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fma4'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xop'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Opteron_G5'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fma4'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='tbm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xop'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Opteron_G5-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fma4'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='tbm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xop'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='SapphireRapids'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-int8'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-tile'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='bus-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrc'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fzrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='serialize'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='tsx-ldtrk'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='SapphireRapids-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-int8'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-tile'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='bus-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrc'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fzrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='serialize'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='tsx-ldtrk'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='SapphireRapids-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-int8'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-tile'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='bus-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fbsdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrc'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fzrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='psdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='sbdr-ssdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='serialize'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='tsx-ldtrk'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='SapphireRapids-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-int8'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='amx-tile'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-bf16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-fp16'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512-vpopcntdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bitalg'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vbmi2'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='bus-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='cldemote'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fbsdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrc'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fzrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='la57'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdir64b'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdiri'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='psdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='sbdr-ssdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='serialize'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='taa-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='tsx-ldtrk'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xfd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='SierraForest'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-ne-convert'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni-int8'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='bus-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='cmpccxadd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fbsdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='mcdt-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pbrsb-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='psdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='sbdr-ssdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='serialize'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='SierraForest-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-ifma'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-ne-convert'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx-vnni-int8'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='bus-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='cmpccxadd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fbsdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='fsrs'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ibrs-all'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='mcdt-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pbrsb-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='psdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='sbdr-ssdp-no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='serialize'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vaes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='vpclmulqdq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Client'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Client-IBRS'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Client-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Client-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Client-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Client-v4'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Server'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Server-IBRS'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Server-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Server-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='hle'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='rtm'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Server-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Server-v4'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Skylake-Server-v5'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512bw'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512cd'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512dq'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512f'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='avx512vl'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='invpcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pcid'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='pku'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Snowridge'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='cldemote'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='core-capability'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdir64b'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdiri'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='mpx'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='split-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Snowridge-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='cldemote'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='core-capability'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdir64b'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdiri'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='mpx'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='split-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Snowridge-v2'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='cldemote'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='core-capability'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdir64b'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdiri'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='split-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Snowridge-v3'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='cldemote'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='core-capability'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdir64b'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdiri'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='split-lock-detect'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='Snowridge-v4'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='cldemote'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='erms'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='gfni'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdir64b'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='movdiri'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='xsaves'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='athlon'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='3dnow'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='3dnowext'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='athlon-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='3dnow'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='3dnowext'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='core2duo'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='core2duo-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='coreduo'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='coreduo-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='n270'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='n270-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='ss'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='phenom'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='3dnow'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='3dnowext'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <blockers model='phenom-v1'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='3dnow'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <feature name='3dnowext'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </blockers>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </mode>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  </cpu>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <memoryBacking supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <enum name='sourceType'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <value>file</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <value>anonymous</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <value>memfd</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  </memoryBacking>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <devices>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <disk supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='diskDevice'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>disk</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>cdrom</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>floppy</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>lun</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='bus'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>fdc</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>scsi</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>virtio</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>usb</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>sata</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='model'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>virtio</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>virtio-transitional</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>virtio-non-transitional</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <graphics supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='type'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>vnc</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>egl-headless</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>dbus</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </graphics>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <video supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='modelType'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>vga</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>cirrus</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>virtio</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>none</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>bochs</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>ramfb</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </video>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <hostdev supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='mode'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>subsystem</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='startupPolicy'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>default</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>mandatory</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>requisite</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>optional</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='subsysType'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>usb</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>pci</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>scsi</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='capsType'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='pciBackend'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </hostdev>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <rng supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='model'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>virtio</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>virtio-transitional</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>virtio-non-transitional</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='backendModel'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>random</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>egd</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>builtin</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </rng>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <filesystem supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='driverType'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>path</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>handle</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>virtiofs</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </filesystem>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <tpm supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='model'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>tpm-tis</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>tpm-crb</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='backendModel'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>emulator</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>external</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='backendVersion'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>2.0</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </tpm>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <redirdev supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='bus'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>usb</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </redirdev>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <channel supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='type'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>pty</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>unix</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </channel>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <crypto supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='model'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='type'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>qemu</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='backendModel'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>builtin</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </crypto>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <interface supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='backendType'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>default</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>passt</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </interface>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <panic supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='model'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>isa</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>hyperv</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </panic>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  </devices>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  <features>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <gic supported='no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <vmcoreinfo supported='yes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <genid supported='yes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <backingStoreInput supported='yes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <backup supported='yes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <async-teardown supported='yes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <ps2 supported='yes'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <sev supported='no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <sgx supported='no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <hyperv supported='yes'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      <enum name='features'>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>relaxed</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>vapic</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>spinlocks</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>vpindex</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>runtime</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>synic</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>stimer</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>reset</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>vendor_id</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>frequencies</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>reenlightenment</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>tlbflush</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>ipi</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>avic</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>emsr_bitmap</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:        <value>xmm_input</value>
Oct  8 19:00:59 compute-0 nova_compute[117514]:      </enum>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    </hyperv>
Oct  8 19:00:59 compute-0 nova_compute[117514]:    <launchSecurity supported='no'/>
Oct  8 19:00:59 compute-0 nova_compute[117514]:  </features>
Oct  8 19:00:59 compute-0 nova_compute[117514]: </domainCapabilities>
Oct  8 19:00:59 compute-0 nova_compute[117514]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.784 2 DEBUG nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.785 2 DEBUG nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.785 2 DEBUG nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.785 2 INFO nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Secure Boot support detected#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.788 2 INFO nova.virt.libvirt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.788 2 INFO nova.virt.libvirt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.798 2 DEBUG nova.virt.libvirt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.828 2 INFO nova.virt.node [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Determined node identity 8dadd82c-8ff0-43f1-888f-64abe8b5e349 from /var/lib/nova/compute_id#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.846 2 WARNING nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Compute nodes ['8dadd82c-8ff0-43f1-888f-64abe8b5e349'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.885 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.922 2 WARNING nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.922 2 DEBUG oslo_concurrency.lockutils [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.922 2 DEBUG oslo_concurrency.lockutils [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.923 2 DEBUG oslo_concurrency.lockutils [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:00:59 compute-0 nova_compute[117514]: 2025-10-08 19:00:59.923 2 DEBUG nova.compute.resource_tracker [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 19:00:59 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Oct  8 19:00:59 compute-0 systemd[1]: Started libvirt nodedev daemon.
Oct  8 19:01:00 compute-0 nova_compute[117514]: 2025-10-08 19:01:00.243 2 WARNING nova.virt.libvirt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 19:01:00 compute-0 nova_compute[117514]: 2025-10-08 19:01:00.243 2 DEBUG nova.compute.resource_tracker [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6524MB free_disk=73.64043045043945GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 19:01:00 compute-0 nova_compute[117514]: 2025-10-08 19:01:00.244 2 DEBUG oslo_concurrency.lockutils [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:01:00 compute-0 nova_compute[117514]: 2025-10-08 19:01:00.244 2 DEBUG oslo_concurrency.lockutils [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:01:00 compute-0 nova_compute[117514]: 2025-10-08 19:01:00.255 2 WARNING nova.compute.resource_tracker [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] No compute node record for compute-0.ctlplane.example.com:8dadd82c-8ff0-43f1-888f-64abe8b5e349: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 8dadd82c-8ff0-43f1-888f-64abe8b5e349 could not be found.#033[00m
Oct  8 19:01:00 compute-0 nova_compute[117514]: 2025-10-08 19:01:00.268 2 INFO nova.compute.resource_tracker [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: 8dadd82c-8ff0-43f1-888f-64abe8b5e349#033[00m
Oct  8 19:01:00 compute-0 nova_compute[117514]: 2025-10-08 19:01:00.361 2 DEBUG nova.compute.resource_tracker [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 19:01:00 compute-0 nova_compute[117514]: 2025-10-08 19:01:00.361 2 DEBUG nova.compute.resource_tracker [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 19:01:01 compute-0 nova_compute[117514]: 2025-10-08 19:01:01.470 2 INFO nova.scheduler.client.report [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [req-524900c9-b384-4f60-b315-b075558689a7] Created resource provider record via placement API for resource provider with UUID 8dadd82c-8ff0-43f1-888f-64abe8b5e349 and name compute-0.ctlplane.example.com.#033[00m
Oct  8 19:01:01 compute-0 nova_compute[117514]: 2025-10-08 19:01:01.850 2 DEBUG nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Oct  8 19:01:01 compute-0 nova_compute[117514]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m
Oct  8 19:01:01 compute-0 nova_compute[117514]: 2025-10-08 19:01:01.851 2 INFO nova.virt.libvirt.host [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] kernel doesn't support AMD SEV#033[00m
Oct  8 19:01:01 compute-0 nova_compute[117514]: 2025-10-08 19:01:01.852 2 DEBUG nova.compute.provider_tree [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Updating inventory in ProviderTree for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 19:01:01 compute-0 nova_compute[117514]: 2025-10-08 19:01:01.852 2 DEBUG nova.virt.libvirt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 19:01:01 compute-0 nova_compute[117514]: 2025-10-08 19:01:01.893 2 DEBUG nova.scheduler.client.report [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Updated inventory for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Oct  8 19:01:01 compute-0 nova_compute[117514]: 2025-10-08 19:01:01.893 2 DEBUG nova.compute.provider_tree [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Updating resource provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Oct  8 19:01:01 compute-0 nova_compute[117514]: 2025-10-08 19:01:01.893 2 DEBUG nova.compute.provider_tree [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Updating inventory in ProviderTree for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 19:01:01 compute-0 nova_compute[117514]: 2025-10-08 19:01:01.991 2 DEBUG nova.compute.provider_tree [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Updating resource provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Oct  8 19:01:02 compute-0 nova_compute[117514]: 2025-10-08 19:01:02.013 2 DEBUG nova.compute.resource_tracker [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 19:01:02 compute-0 nova_compute[117514]: 2025-10-08 19:01:02.013 2 DEBUG oslo_concurrency.lockutils [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.770s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:01:02 compute-0 nova_compute[117514]: 2025-10-08 19:01:02.013 2 DEBUG nova.service [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m
Oct  8 19:01:02 compute-0 nova_compute[117514]: 2025-10-08 19:01:02.099 2 DEBUG nova.service [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m
Oct  8 19:01:02 compute-0 nova_compute[117514]: 2025-10-08 19:01:02.100 2 DEBUG nova.servicegroup.drivers.db [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m
Oct  8 19:01:03 compute-0 systemd-logind[844]: New session 11 of user zuul.
Oct  8 19:01:03 compute-0 systemd[1]: Started Session 11 of User zuul.
Oct  8 19:01:04 compute-0 python3.9[118014]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  8 19:01:05 compute-0 auditd[775]: Audit daemon rotating log files
Oct  8 19:01:06 compute-0 python3.9[118170]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  8 19:01:06 compute-0 systemd[1]: Reloading.
Oct  8 19:01:06 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 19:01:06 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 19:01:07 compute-0 python3.9[118356]: ansible-ansible.builtin.service_facts Invoked
Oct  8 19:01:07 compute-0 network[118373]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  8 19:01:07 compute-0 network[118374]: 'network-scripts' will be removed from distribution in near future.
Oct  8 19:01:07 compute-0 network[118375]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  8 19:01:12 compute-0 podman[118526]: 2025-10-08 19:01:12.685814595 +0000 UTC m=+0.095871048 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct  8 19:01:13 compute-0 python3.9[118672]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 19:01:14 compute-0 python3.9[118825]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:01:15 compute-0 python3.9[118977]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:01:15 compute-0 rsyslogd[1288]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  8 19:01:15 compute-0 rsyslogd[1288]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  8 19:01:15 compute-0 rsyslogd[1288]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  8 19:01:15 compute-0 python3.9[119130]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 19:01:16 compute-0 python3.9[119282]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  8 19:01:17 compute-0 python3.9[119434]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  8 19:01:17 compute-0 systemd[1]: Reloading.
Oct  8 19:01:17 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 19:01:17 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 19:01:18 compute-0 nova_compute[117514]: 2025-10-08 19:01:18.102 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:01:18 compute-0 nova_compute[117514]: 2025-10-08 19:01:18.137 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:01:18 compute-0 podman[119594]: 2025-10-08 19:01:18.657211402 +0000 UTC m=+0.084215974 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  8 19:01:18 compute-0 python3.9[119642]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 19:01:19 compute-0 podman[119767]: 2025-10-08 19:01:19.498677312 +0000 UTC m=+0.066400713 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 19:01:19 compute-0 python3.9[119814]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 19:01:20 compute-0 podman[119938]: 2025-10-08 19:01:20.538019443 +0000 UTC m=+0.138839009 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  8 19:01:20 compute-0 python3.9[119977]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 19:01:21 compute-0 python3.9[120142]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:01:22 compute-0 python3.9[120263]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759950080.9010713-133-277864867710341/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  8 19:01:23 compute-0 python3.9[120415]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Oct  8 19:01:24 compute-0 python3.9[120567]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Oct  8 19:01:24 compute-0 python3.9[120720]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  8 19:01:26 compute-0 python3.9[120878]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct  8 19:01:27 compute-0 python3.9[121036]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:01:27 compute-0 python3.9[121157]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759950086.776381-201-105578576111959/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:01:28 compute-0 python3.9[121307]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:01:29 compute-0 python3.9[121428]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759950088.1281173-201-132609462567754/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:01:30 compute-0 python3.9[121578]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:01:30 compute-0 python3.9[121699]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759950089.5040212-201-19806015132249/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:01:31 compute-0 python3.9[121849]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 19:01:32 compute-0 python3.9[122001]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 19:01:32 compute-0 python3.9[122153]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:01:33 compute-0 python3.9[122274]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759950092.401349-260-234548934646737/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:01:34 compute-0 python3.9[122424]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:01:34 compute-0 python3.9[122500]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:01:35 compute-0 python3.9[122650]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:01:36 compute-0 python3.9[122771]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759950094.9570293-260-38171448417766/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=6747f2067b9284624d06fbad47fbd56de1e9892c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:01:36 compute-0 python3.9[122921]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:01:37 compute-0 python3.9[123042]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759950096.2232506-260-55177451141639/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:01:38 compute-0 python3.9[123192]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:01:38 compute-0 python3.9[123313]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759950097.5152826-260-84171440704575/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:01:39 compute-0 python3.9[123463]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:01:39 compute-0 python3.9[123584]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759950098.84073-260-166098265098715/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=3820eb6e48c35431ebf53228213a5d51b7591223 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:01:40 compute-0 python3.9[123735]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:01:41 compute-0 python3.9[123856]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759950100.127853-260-253228229858692/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:01:41 compute-0 python3.9[124006]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:01:42 compute-0 python3.9[124127]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759950101.361017-260-280697138710755/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=33df3bf08923ad9105770f5abb51d4cde791931a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:01:42 compute-0 podman[124251]: 2025-10-08 19:01:42.943652628 +0000 UTC m=+0.061486843 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 19:01:43 compute-0 python3.9[124292]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:01:43 compute-0 python3.9[124418]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759950102.61923-260-110516510466088/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:01:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:01:44.220 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:01:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:01:44.221 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:01:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:01:44.221 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:01:44 compute-0 python3.9[124568]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:01:44 compute-0 python3.9[124689]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759950103.8702881-260-216314575236660/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=8bed8129af2c9145e8d37569bb493c0de1895d6f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:01:45 compute-0 python3.9[124839]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:01:46 compute-0 python3.9[124960]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759950105.1166286-260-225012113742851/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:01:47 compute-0 python3.9[125110]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:01:47 compute-0 python3.9[125186]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:01:48 compute-0 python3.9[125336]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:01:48 compute-0 python3.9[125412]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:01:48 compute-0 podman[125413]: 2025-10-08 19:01:48.99684043 +0000 UTC m=+0.086904631 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 19:01:49 compute-0 podman[125583]: 2025-10-08 19:01:49.646125153 +0000 UTC m=+0.059698082 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Oct  8 19:01:49 compute-0 python3.9[125582]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:01:50 compute-0 python3.9[125677]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:01:50 compute-0 podman[125801]: 2025-10-08 19:01:50.799170651 +0000 UTC m=+0.134366091 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 19:01:50 compute-0 python3.9[125849]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:01:51 compute-0 python3.9[126008]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:01:52 compute-0 python3.9[126160]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  8 19:01:53 compute-0 python3.9[126312]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 19:01:53 compute-0 systemd[1]: Reloading.
Oct  8 19:01:53 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 19:01:53 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 19:01:53 compute-0 systemd[1]: Listening on Podman API Socket.
Oct  8 19:01:54 compute-0 python3.9[126504]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:01:55 compute-0 python3.9[126627]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759950114.1005082-482-101389055198168/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  8 19:01:55 compute-0 python3.9[126703]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:01:56 compute-0 python3.9[126826]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759950114.1005082-482-101389055198168/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  8 19:01:57 compute-0 python3.9[126978]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False
Oct  8 19:01:58 compute-0 python3.9[127130]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  8 19:01:58 compute-0 nova_compute[117514]: 2025-10-08 19:01:58.719 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:01:58 compute-0 nova_compute[117514]: 2025-10-08 19:01:58.720 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:01:58 compute-0 nova_compute[117514]: 2025-10-08 19:01:58.721 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 19:01:58 compute-0 nova_compute[117514]: 2025-10-08 19:01:58.721 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 19:01:58 compute-0 nova_compute[117514]: 2025-10-08 19:01:58.769 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 19:01:58 compute-0 nova_compute[117514]: 2025-10-08 19:01:58.769 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:01:58 compute-0 nova_compute[117514]: 2025-10-08 19:01:58.770 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:01:58 compute-0 nova_compute[117514]: 2025-10-08 19:01:58.770 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:01:58 compute-0 nova_compute[117514]: 2025-10-08 19:01:58.771 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:01:58 compute-0 nova_compute[117514]: 2025-10-08 19:01:58.771 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:01:58 compute-0 nova_compute[117514]: 2025-10-08 19:01:58.772 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:01:58 compute-0 nova_compute[117514]: 2025-10-08 19:01:58.772 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 19:01:58 compute-0 nova_compute[117514]: 2025-10-08 19:01:58.773 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:01:58 compute-0 nova_compute[117514]: 2025-10-08 19:01:58.801 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:01:58 compute-0 nova_compute[117514]: 2025-10-08 19:01:58.802 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:01:58 compute-0 nova_compute[117514]: 2025-10-08 19:01:58.802 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:01:58 compute-0 nova_compute[117514]: 2025-10-08 19:01:58.803 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 19:01:59 compute-0 nova_compute[117514]: 2025-10-08 19:01:59.006 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 19:01:59 compute-0 nova_compute[117514]: 2025-10-08 19:01:59.007 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6529MB free_disk=73.6331787109375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 19:01:59 compute-0 nova_compute[117514]: 2025-10-08 19:01:59.007 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:01:59 compute-0 nova_compute[117514]: 2025-10-08 19:01:59.008 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:01:59 compute-0 nova_compute[117514]: 2025-10-08 19:01:59.065 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 19:01:59 compute-0 nova_compute[117514]: 2025-10-08 19:01:59.066 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 19:01:59 compute-0 nova_compute[117514]: 2025-10-08 19:01:59.097 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:01:59 compute-0 nova_compute[117514]: 2025-10-08 19:01:59.110 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:01:59 compute-0 nova_compute[117514]: 2025-10-08 19:01:59.112 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 19:01:59 compute-0 nova_compute[117514]: 2025-10-08 19:01:59.112 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:01:59 compute-0 python3[127282]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False
Oct  8 19:01:59 compute-0 podman[127319]: 2025-10-08 19:01:59.9479515 +0000 UTC m=+0.057012564 container create e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute)
Oct  8 19:01:59 compute-0 podman[127319]: 2025-10-08 19:01:59.916845229 +0000 UTC m=+0.025906283 image pull 5397cd841d80292a5786d82cb8a2bcd574988efb08c605ba6eaaa59d6f646815 quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189
Oct  8 19:01:59 compute-0 python3[127282]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189 kolla_start
Oct  8 19:02:00 compute-0 python3.9[127508]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 19:02:01 compute-0 python3.9[127662]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:02:02 compute-0 python3.9[127813]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759950121.8551555-546-74308157265001/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:02:03 compute-0 python3.9[127889]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  8 19:02:03 compute-0 systemd[1]: Reloading.
Oct  8 19:02:03 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 19:02:03 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 19:02:04 compute-0 python3.9[128000]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 19:02:04 compute-0 systemd[1]: Reloading.
Oct  8 19:02:04 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 19:02:04 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 19:02:05 compute-0 systemd[1]: Starting ceilometer_agent_compute container...
Oct  8 19:02:05 compute-0 systemd[1]: Started libcrun container.
Oct  8 19:02:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0209ee27825f3f77b76d9449285ecee2d0c8c3ef3bf4e58b622139dc871d8590/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Oct  8 19:02:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0209ee27825f3f77b76d9449285ecee2d0c8c3ef3bf4e58b622139dc871d8590/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  8 19:02:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0209ee27825f3f77b76d9449285ecee2d0c8c3ef3bf4e58b622139dc871d8590/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Oct  8 19:02:05 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0209ee27825f3f77b76d9449285ecee2d0c8c3ef3bf4e58b622139dc871d8590/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Oct  8 19:02:05 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f.
Oct  8 19:02:05 compute-0 podman[128040]: 2025-10-08 19:02:05.274366285 +0000 UTC m=+0.158798209 container init e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3)
Oct  8 19:02:05 compute-0 ceilometer_agent_compute[128055]: + sudo -E kolla_set_configs
Oct  8 19:02:05 compute-0 podman[128040]: 2025-10-08 19:02:05.309051018 +0000 UTC m=+0.193482892 container start e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team)
Oct  8 19:02:05 compute-0 podman[128040]: ceilometer_agent_compute
Oct  8 19:02:05 compute-0 systemd[1]: Started ceilometer_agent_compute container.
Oct  8 19:02:05 compute-0 rsyslogd[1288]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  8 19:02:05 compute-0 ceilometer_agent_compute[128055]: sudo: unable to send audit message: Operation not permitted
Oct  8 19:02:05 compute-0 podman[128061]: 2025-10-08 19:02:05.376227932 +0000 UTC m=+0.058904388 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  8 19:02:05 compute-0 systemd[1]: e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f-52b4da775efe27cc.service: Main process exited, code=exited, status=1/FAILURE
Oct  8 19:02:05 compute-0 systemd[1]: e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f-52b4da775efe27cc.service: Failed with result 'exit-code'.
Oct  8 19:02:05 compute-0 ceilometer_agent_compute[128055]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  8 19:02:05 compute-0 ceilometer_agent_compute[128055]: INFO:__main__:Validating config file
Oct  8 19:02:05 compute-0 ceilometer_agent_compute[128055]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  8 19:02:05 compute-0 ceilometer_agent_compute[128055]: INFO:__main__:Copying service configuration files
Oct  8 19:02:05 compute-0 ceilometer_agent_compute[128055]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Oct  8 19:02:05 compute-0 ceilometer_agent_compute[128055]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Oct  8 19:02:05 compute-0 ceilometer_agent_compute[128055]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Oct  8 19:02:05 compute-0 ceilometer_agent_compute[128055]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Oct  8 19:02:05 compute-0 ceilometer_agent_compute[128055]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Oct  8 19:02:05 compute-0 ceilometer_agent_compute[128055]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Oct  8 19:02:05 compute-0 ceilometer_agent_compute[128055]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct  8 19:02:05 compute-0 ceilometer_agent_compute[128055]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct  8 19:02:05 compute-0 ceilometer_agent_compute[128055]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct  8 19:02:05 compute-0 ceilometer_agent_compute[128055]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct  8 19:02:05 compute-0 ceilometer_agent_compute[128055]: INFO:__main__:Writing out command to execute
Oct  8 19:02:05 compute-0 ceilometer_agent_compute[128055]: ++ cat /run_command
Oct  8 19:02:05 compute-0 ceilometer_agent_compute[128055]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Oct  8 19:02:05 compute-0 ceilometer_agent_compute[128055]: + ARGS=
Oct  8 19:02:05 compute-0 ceilometer_agent_compute[128055]: + sudo kolla_copy_cacerts
Oct  8 19:02:05 compute-0 ceilometer_agent_compute[128055]: sudo: unable to send audit message: Operation not permitted
Oct  8 19:02:05 compute-0 ceilometer_agent_compute[128055]: + [[ ! -n '' ]]
Oct  8 19:02:05 compute-0 ceilometer_agent_compute[128055]: + . kolla_extend_start
Oct  8 19:02:05 compute-0 ceilometer_agent_compute[128055]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Oct  8 19:02:05 compute-0 ceilometer_agent_compute[128055]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Oct  8 19:02:05 compute-0 ceilometer_agent_compute[128055]: + umask 0022
Oct  8 19:02:05 compute-0 ceilometer_agent_compute[128055]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Oct  8 19:02:06 compute-0 python3.9[128240]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 19:02:06 compute-0 systemd[1]: Stopping ceilometer_agent_compute container...
Oct  8 19:02:06 compute-0 systemd[1]: libpod-e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f.scope: Deactivated successfully.
Oct  8 19:02:06 compute-0 podman[128244]: 2025-10-08 19:02:06.352771451 +0000 UTC m=+0.041176500 container died e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 19:02:06 compute-0 systemd[1]: e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f-52b4da775efe27cc.timer: Deactivated successfully.
Oct  8 19:02:06 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f.
Oct  8 19:02:06 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f-userdata-shm.mount: Deactivated successfully.
Oct  8 19:02:06 compute-0 systemd[1]: var-lib-containers-storage-overlay-0209ee27825f3f77b76d9449285ecee2d0c8c3ef3bf4e58b622139dc871d8590-merged.mount: Deactivated successfully.
Oct  8 19:02:06 compute-0 podman[128244]: 2025-10-08 19:02:06.433313738 +0000 UTC m=+0.121718787 container cleanup e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  8 19:02:06 compute-0 podman[128244]: ceilometer_agent_compute
Oct  8 19:02:06 compute-0 podman[128274]: ceilometer_agent_compute
Oct  8 19:02:06 compute-0 systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated successfully.
Oct  8 19:02:06 compute-0 systemd[1]: Stopped ceilometer_agent_compute container.
Oct  8 19:02:06 compute-0 systemd[1]: Starting ceilometer_agent_compute container...
Oct  8 19:02:06 compute-0 systemd[1]: Started libcrun container.
Oct  8 19:02:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0209ee27825f3f77b76d9449285ecee2d0c8c3ef3bf4e58b622139dc871d8590/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Oct  8 19:02:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0209ee27825f3f77b76d9449285ecee2d0c8c3ef3bf4e58b622139dc871d8590/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  8 19:02:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0209ee27825f3f77b76d9449285ecee2d0c8c3ef3bf4e58b622139dc871d8590/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Oct  8 19:02:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0209ee27825f3f77b76d9449285ecee2d0c8c3ef3bf4e58b622139dc871d8590/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Oct  8 19:02:06 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f.
Oct  8 19:02:06 compute-0 podman[128287]: 2025-10-08 19:02:06.683115022 +0000 UTC m=+0.155092893 container init e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 19:02:06 compute-0 ceilometer_agent_compute[128303]: + sudo -E kolla_set_configs
Oct  8 19:02:06 compute-0 ceilometer_agent_compute[128303]: sudo: unable to send audit message: Operation not permitted
Oct  8 19:02:06 compute-0 podman[128287]: 2025-10-08 19:02:06.726984719 +0000 UTC m=+0.198962590 container start e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 19:02:06 compute-0 podman[128287]: ceilometer_agent_compute
Oct  8 19:02:06 compute-0 systemd[1]: Started ceilometer_agent_compute container.
Oct  8 19:02:06 compute-0 ceilometer_agent_compute[128303]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  8 19:02:06 compute-0 ceilometer_agent_compute[128303]: INFO:__main__:Validating config file
Oct  8 19:02:06 compute-0 ceilometer_agent_compute[128303]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  8 19:02:06 compute-0 ceilometer_agent_compute[128303]: INFO:__main__:Copying service configuration files
Oct  8 19:02:06 compute-0 ceilometer_agent_compute[128303]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Oct  8 19:02:06 compute-0 ceilometer_agent_compute[128303]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Oct  8 19:02:06 compute-0 ceilometer_agent_compute[128303]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Oct  8 19:02:06 compute-0 ceilometer_agent_compute[128303]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Oct  8 19:02:06 compute-0 ceilometer_agent_compute[128303]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Oct  8 19:02:06 compute-0 ceilometer_agent_compute[128303]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Oct  8 19:02:06 compute-0 ceilometer_agent_compute[128303]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct  8 19:02:06 compute-0 ceilometer_agent_compute[128303]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct  8 19:02:06 compute-0 ceilometer_agent_compute[128303]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct  8 19:02:06 compute-0 ceilometer_agent_compute[128303]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct  8 19:02:06 compute-0 ceilometer_agent_compute[128303]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct  8 19:02:06 compute-0 ceilometer_agent_compute[128303]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct  8 19:02:06 compute-0 ceilometer_agent_compute[128303]: INFO:__main__:Writing out command to execute
Oct  8 19:02:06 compute-0 ceilometer_agent_compute[128303]: ++ cat /run_command
Oct  8 19:02:06 compute-0 ceilometer_agent_compute[128303]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Oct  8 19:02:06 compute-0 ceilometer_agent_compute[128303]: + ARGS=
Oct  8 19:02:06 compute-0 ceilometer_agent_compute[128303]: + sudo kolla_copy_cacerts
Oct  8 19:02:06 compute-0 podman[128310]: 2025-10-08 19:02:06.817834371 +0000 UTC m=+0.073486206 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct  8 19:02:06 compute-0 systemd[1]: e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f-39eec91179f1428e.service: Main process exited, code=exited, status=1/FAILURE
Oct  8 19:02:06 compute-0 systemd[1]: e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f-39eec91179f1428e.service: Failed with result 'exit-code'.
Oct  8 19:02:06 compute-0 ceilometer_agent_compute[128303]: sudo: unable to send audit message: Operation not permitted
Oct  8 19:02:06 compute-0 ceilometer_agent_compute[128303]: + [[ ! -n '' ]]
Oct  8 19:02:06 compute-0 ceilometer_agent_compute[128303]: + . kolla_extend_start
Oct  8 19:02:06 compute-0 ceilometer_agent_compute[128303]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Oct  8 19:02:06 compute-0 ceilometer_agent_compute[128303]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Oct  8 19:02:06 compute-0 ceilometer_agent_compute[128303]: + umask 0022
Oct  8 19:02:06 compute-0 ceilometer_agent_compute[128303]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Oct  8 19:02:07 compute-0 python3.9[128487]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.903 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.903 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.903 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.904 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.904 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.904 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.905 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.905 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.905 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.905 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.905 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.906 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.906 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.906 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.906 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.907 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.907 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.907 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.907 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.907 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.908 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.908 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.908 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.908 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.908 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.909 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.909 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.909 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.909 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.910 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.910 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.910 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.910 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.911 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.911 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.911 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.911 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.911 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.912 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.912 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.912 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.912 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.912 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.913 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.913 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.913 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.913 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.913 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.914 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.914 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.914 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.914 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.915 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.915 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.915 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.915 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.915 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.916 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.916 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.916 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.916 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.916 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.917 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.917 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.917 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.917 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.917 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.918 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.918 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.918 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.918 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.918 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.919 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.919 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.919 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.919 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.919 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.920 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.920 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.920 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.920 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.921 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.921 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.921 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.921 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.921 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.921 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.921 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.922 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.922 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.922 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.922 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.922 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.922 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.922 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.922 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.922 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.922 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.922 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.923 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.923 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.923 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.923 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.923 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.923 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.923 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.923 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.923 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.923 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.924 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.924 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.924 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.924 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.924 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.924 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.924 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.924 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.924 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.925 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.925 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.925 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.925 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.925 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.925 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.925 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.925 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.925 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.925 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.925 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.926 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.926 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.926 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.926 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.926 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.926 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.926 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.926 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.926 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.926 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.926 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.926 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.927 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.927 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.927 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.927 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.927 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.927 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.927 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.927 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.927 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.927 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.927 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.928 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.928 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.928 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.928 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.928 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.928 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.928 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.928 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.928 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.928 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.928 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.928 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.929 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.946 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.947 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Oct  8 19:02:07 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:07.948 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.039 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Oct  8 19:02:08 compute-0 python3.9[128611]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759950126.9605684-578-110074488624178/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.198 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.198 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.198 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.198 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.198 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.199 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.199 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.199 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.199 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.199 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.199 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.199 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.199 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.200 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.200 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.200 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.200 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.201 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.201 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.201 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.201 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.201 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.201 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.201 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.201 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.202 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.202 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.202 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.202 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.202 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.202 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.202 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.202 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.203 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.203 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.203 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.203 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.203 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.203 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.203 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.203 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.204 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.204 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.204 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.204 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.204 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.204 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.204 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.204 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.205 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.205 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.205 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.205 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.205 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.205 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.205 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.206 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.206 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.206 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.206 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.206 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.206 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.206 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.206 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.206 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.207 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.207 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.207 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.207 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.207 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.207 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.207 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.208 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.208 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.208 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.208 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.208 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.208 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.208 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.208 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.209 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.209 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.209 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.209 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.209 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.209 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.209 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.209 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.209 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.210 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.210 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.210 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.210 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.210 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.210 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.210 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.210 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.211 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.211 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.211 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.211 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.211 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.211 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.212 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.212 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.212 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.212 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.212 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.212 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.212 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.213 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.213 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.213 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.213 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.213 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.213 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.213 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.213 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.214 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.214 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.214 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.214 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.214 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.214 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.214 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.214 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.215 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.215 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.215 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.215 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.215 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.215 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.215 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.215 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.216 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.216 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.216 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.216 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.216 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.216 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.216 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.216 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.216 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.216 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.217 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.217 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.217 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.217 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.217 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.217 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.217 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.217 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.218 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.218 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.218 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.218 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.218 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.218 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.218 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.218 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.218 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.219 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.219 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.219 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.219 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.219 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.219 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.219 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.220 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.220 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.220 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.220 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.220 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.220 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.220 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.221 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.221 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.221 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.221 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.221 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.221 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.221 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.221 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.221 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.222 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.222 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.222 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.222 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.222 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.222 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.222 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.222 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.223 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.223 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.223 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.223 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.223 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.223 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.223 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.223 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.224 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.224 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.224 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.224 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.224 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.224 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.224 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.225 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.225 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.225 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.225 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.225 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.226 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.226 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.226 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.226 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.226 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.226 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.226 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.230 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.237 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.241 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:02:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:02:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:02:09 compute-0 python3.9[128768]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json debug=False
Oct  8 19:02:10 compute-0 python3.9[128920]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  8 19:02:11 compute-0 python3[129072]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Oct  8 19:02:11 compute-0 podman[129107]: 2025-10-08 19:02:11.275343904 +0000 UTC m=+0.064693834 container create 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=edpm, container_name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 19:02:11 compute-0 podman[129107]: 2025-10-08 19:02:11.241813904 +0000 UTC m=+0.031163844 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Oct  8 19:02:11 compute-0 python3[129072]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Oct  8 19:02:12 compute-0 python3.9[129297]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 19:02:12 compute-0 python3.9[129451]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:02:13 compute-0 podman[129574]: 2025-10-08 19:02:13.578913339 +0000 UTC m=+0.088195716 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd)
Oct  8 19:02:13 compute-0 python3.9[129617]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759950133.0007854-631-156586757807026/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:02:14 compute-0 python3.9[129694]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  8 19:02:14 compute-0 systemd[1]: Reloading.
Oct  8 19:02:14 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 19:02:14 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 19:02:14 compute-0 systemd[1]: Starting dnf makecache...
Oct  8 19:02:14 compute-0 dnf[129729]: Repository 'gating-repo' is missing name in configuration, using id.
Oct  8 19:02:14 compute-0 dnf[129729]: Metadata cache refreshed recently.
Oct  8 19:02:14 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Oct  8 19:02:14 compute-0 systemd[1]: Finished dnf makecache.
Oct  8 19:02:15 compute-0 python3.9[129805]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 19:02:15 compute-0 systemd[1]: Reloading.
Oct  8 19:02:15 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 19:02:15 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 19:02:15 compute-0 systemd[1]: Starting node_exporter container...
Oct  8 19:02:15 compute-0 systemd[1]: Started libcrun container.
Oct  8 19:02:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/da29446cf246f7a9927e318b3b3f382c7ef69cd14bb17f2b6f9a6d3043149995/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct  8 19:02:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/da29446cf246f7a9927e318b3b3f382c7ef69cd14bb17f2b6f9a6d3043149995/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  8 19:02:15 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213.
Oct  8 19:02:15 compute-0 podman[129846]: 2025-10-08 19:02:15.805636323 +0000 UTC m=+0.182451777 container init 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 19:02:15 compute-0 podman[129846]: 2025-10-08 19:02:15.83732952 +0000 UTC m=+0.214144934 container start 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 19:02:15 compute-0 podman[129846]: node_exporter
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.852Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Oct  8 19:02:15 compute-0 systemd[1]: Started node_exporter container.
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.852Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.853Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.854Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.855Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.855Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.855Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.855Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.855Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=arp
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=bcache
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=bonding
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=btrfs
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=conntrack
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=cpu
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=cpufreq
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=diskstats
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=edac
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=fibrechannel
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=filefd
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=filesystem
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=infiniband
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=ipvs
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=loadavg
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=mdadm
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=meminfo
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=netclass
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=netdev
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=netstat
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=nfs
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=nfsd
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=nvme
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=schedstat
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=sockstat
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=softnet
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=systemd
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=tapestats
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=udp_queues
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=vmstat
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=xfs
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.856Z caller=node_exporter.go:117 level=info collector=zfs
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.857Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Oct  8 19:02:15 compute-0 node_exporter[129862]: ts=2025-10-08T19:02:15.857Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Oct  8 19:02:15 compute-0 podman[129867]: 2025-10-08 19:02:15.92599526 +0000 UTC m=+0.079518439 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 19:02:16 compute-0 python3.9[130046]: ansible-ansible.builtin.systemd Invoked with name=edpm_node_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 19:02:16 compute-0 systemd[1]: Stopping node_exporter container...
Oct  8 19:02:16 compute-0 systemd[1]: libpod-9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213.scope: Deactivated successfully.
Oct  8 19:02:16 compute-0 podman[130050]: 2025-10-08 19:02:16.850012454 +0000 UTC m=+0.045942197 container died 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 19:02:16 compute-0 systemd[1]: 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213-3eedbe080720916f.timer: Deactivated successfully.
Oct  8 19:02:16 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213.
Oct  8 19:02:16 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213-userdata-shm.mount: Deactivated successfully.
Oct  8 19:02:16 compute-0 systemd[1]: var-lib-containers-storage-overlay-da29446cf246f7a9927e318b3b3f382c7ef69cd14bb17f2b6f9a6d3043149995-merged.mount: Deactivated successfully.
Oct  8 19:02:16 compute-0 podman[130050]: 2025-10-08 19:02:16.976453116 +0000 UTC m=+0.172382849 container cleanup 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 19:02:16 compute-0 podman[130050]: node_exporter
Oct  8 19:02:16 compute-0 systemd[1]: edpm_node_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Oct  8 19:02:17 compute-0 podman[130083]: node_exporter
Oct  8 19:02:17 compute-0 systemd[1]: edpm_node_exporter.service: Failed with result 'exit-code'.
Oct  8 19:02:17 compute-0 systemd[1]: Stopped node_exporter container.
Oct  8 19:02:17 compute-0 systemd[1]: Starting node_exporter container...
Oct  8 19:02:17 compute-0 systemd[1]: Started libcrun container.
Oct  8 19:02:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/da29446cf246f7a9927e318b3b3f382c7ef69cd14bb17f2b6f9a6d3043149995/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct  8 19:02:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/da29446cf246f7a9927e318b3b3f382c7ef69cd14bb17f2b6f9a6d3043149995/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  8 19:02:17 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213.
Oct  8 19:02:17 compute-0 podman[130095]: 2025-10-08 19:02:17.26572442 +0000 UTC m=+0.173198011 container init 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.282Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.282Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.282Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.283Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.283Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.283Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.283Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.284Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.284Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.284Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.284Z caller=node_exporter.go:117 level=info collector=arp
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.284Z caller=node_exporter.go:117 level=info collector=bcache
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.284Z caller=node_exporter.go:117 level=info collector=bonding
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=btrfs
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=conntrack
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=cpu
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=cpufreq
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=diskstats
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=edac
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=fibrechannel
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=filefd
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=filesystem
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=infiniband
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=ipvs
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=loadavg
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=mdadm
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=meminfo
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=netclass
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=netdev
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=netstat
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=nfs
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=nfsd
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=nvme
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=schedstat
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=sockstat
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=softnet
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=systemd
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=tapestats
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=udp_queues
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=vmstat
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=xfs
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.285Z caller=node_exporter.go:117 level=info collector=zfs
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.286Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Oct  8 19:02:17 compute-0 node_exporter[130110]: ts=2025-10-08T19:02:17.286Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Oct  8 19:02:17 compute-0 podman[130095]: 2025-10-08 19:02:17.296461431 +0000 UTC m=+0.203935042 container start 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 19:02:17 compute-0 podman[130095]: node_exporter
Oct  8 19:02:17 compute-0 systemd[1]: Started node_exporter container.
Oct  8 19:02:17 compute-0 podman[130120]: 2025-10-08 19:02:17.392571913 +0000 UTC m=+0.083825231 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 19:02:18 compute-0 python3.9[130295]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:02:18 compute-0 python3.9[130418]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759950137.550567-663-61679827133784/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  8 19:02:19 compute-0 podman[130542]: 2025-10-08 19:02:19.390992488 +0000 UTC m=+0.093152519 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 19:02:19 compute-0 python3.9[130589]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Oct  8 19:02:20 compute-0 podman[130714]: 2025-10-08 19:02:20.254301773 +0000 UTC m=+0.063005975 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Oct  8 19:02:20 compute-0 python3.9[130761]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  8 19:02:21 compute-0 podman[130885]: 2025-10-08 19:02:21.178442741 +0000 UTC m=+0.159820028 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  8 19:02:21 compute-0 python3[130930]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Oct  8 19:02:22 compute-0 podman[130952]: 2025-10-08 19:02:22.790830519 +0000 UTC m=+1.361819744 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Oct  8 19:02:22 compute-0 podman[131050]: 2025-10-08 19:02:22.917630501 +0000 UTC m=+0.037460794 container create 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm, container_name=podman_exporter)
Oct  8 19:02:22 compute-0 podman[131050]: 2025-10-08 19:02:22.898724209 +0000 UTC m=+0.018554512 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Oct  8 19:02:22 compute-0 python3[130930]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Oct  8 19:02:23 compute-0 python3.9[131240]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 19:02:24 compute-0 python3.9[131394]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:02:25 compute-0 python3.9[131545]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759950144.8437123-716-136675860338874/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:02:26 compute-0 python3.9[131621]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  8 19:02:26 compute-0 systemd[1]: Reloading.
Oct  8 19:02:26 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 19:02:26 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 19:02:27 compute-0 python3.9[131732]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 19:02:27 compute-0 systemd[1]: Reloading.
Oct  8 19:02:27 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 19:02:27 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 19:02:27 compute-0 systemd[1]: Starting podman_exporter container...
Oct  8 19:02:27 compute-0 systemd[1]: Started libcrun container.
Oct  8 19:02:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a326ecbc4abca4f807e2a12af1256dd5205b8dffe1da0f2d2bd48cfaf93a0a8a/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  8 19:02:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a326ecbc4abca4f807e2a12af1256dd5205b8dffe1da0f2d2bd48cfaf93a0a8a/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct  8 19:02:27 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d.
Oct  8 19:02:27 compute-0 podman[131772]: 2025-10-08 19:02:27.673460189 +0000 UTC m=+0.137762107 container init 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 19:02:27 compute-0 podman_exporter[131788]: ts=2025-10-08T19:02:27.699Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Oct  8 19:02:27 compute-0 podman_exporter[131788]: ts=2025-10-08T19:02:27.699Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Oct  8 19:02:27 compute-0 podman_exporter[131788]: ts=2025-10-08T19:02:27.699Z caller=handler.go:94 level=info msg="enabled collectors"
Oct  8 19:02:27 compute-0 podman_exporter[131788]: ts=2025-10-08T19:02:27.699Z caller=handler.go:105 level=info collector=container
Oct  8 19:02:27 compute-0 podman[131772]: 2025-10-08 19:02:27.712120336 +0000 UTC m=+0.176422184 container start 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 19:02:27 compute-0 podman[131772]: podman_exporter
Oct  8 19:02:27 compute-0 systemd[1]: Starting Podman API Service...
Oct  8 19:02:27 compute-0 systemd[1]: Started podman_exporter container.
Oct  8 19:02:27 compute-0 systemd[1]: Started Podman API Service.
Oct  8 19:02:27 compute-0 podman[131799]: time="2025-10-08T19:02:27Z" level=info msg="/usr/bin/podman filtering at log level info"
Oct  8 19:02:27 compute-0 podman[131799]: time="2025-10-08T19:02:27Z" level=info msg="Setting parallel job count to 25"
Oct  8 19:02:27 compute-0 podman[131799]: time="2025-10-08T19:02:27Z" level=info msg="Using sqlite as database backend"
Oct  8 19:02:27 compute-0 podman[131799]: time="2025-10-08T19:02:27Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Oct  8 19:02:27 compute-0 podman[131799]: time="2025-10-08T19:02:27Z" level=info msg="Using systemd socket activation to determine API endpoint"
Oct  8 19:02:27 compute-0 podman[131799]: time="2025-10-08T19:02:27Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Oct  8 19:02:27 compute-0 podman[131799]: @ - - [08/Oct/2025:19:02:27 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Oct  8 19:02:27 compute-0 podman[131799]: time="2025-10-08T19:02:27Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct  8 19:02:27 compute-0 podman[131798]: 2025-10-08 19:02:27.820941943 +0000 UTC m=+0.095170177 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 19:02:27 compute-0 systemd[1]: 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d-8d3396b962ac6b3.service: Main process exited, code=exited, status=1/FAILURE
Oct  8 19:02:27 compute-0 systemd[1]: 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d-8d3396b962ac6b3.service: Failed with result 'exit-code'.
Oct  8 19:02:27 compute-0 podman[131799]: @ - - [08/Oct/2025:19:02:27 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 23086 "" "Go-http-client/1.1"
Oct  8 19:02:27 compute-0 podman_exporter[131788]: ts=2025-10-08T19:02:27.845Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Oct  8 19:02:27 compute-0 podman_exporter[131788]: ts=2025-10-08T19:02:27.846Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Oct  8 19:02:27 compute-0 podman_exporter[131788]: ts=2025-10-08T19:02:27.846Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Oct  8 19:02:28 compute-0 python3.9[131985]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 19:02:28 compute-0 systemd[1]: Stopping podman_exporter container...
Oct  8 19:02:28 compute-0 podman[131799]: @ - - [08/Oct/2025:19:02:27 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 1641 "" "Go-http-client/1.1"
Oct  8 19:02:28 compute-0 systemd[1]: libpod-9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d.scope: Deactivated successfully.
Oct  8 19:02:28 compute-0 podman[131989]: 2025-10-08 19:02:28.806586941 +0000 UTC m=+0.069544813 container died 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 19:02:28 compute-0 systemd[1]: 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d-8d3396b962ac6b3.timer: Deactivated successfully.
Oct  8 19:02:28 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d.
Oct  8 19:02:28 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d-userdata-shm.mount: Deactivated successfully.
Oct  8 19:02:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-a326ecbc4abca4f807e2a12af1256dd5205b8dffe1da0f2d2bd48cfaf93a0a8a-merged.mount: Deactivated successfully.
Oct  8 19:02:29 compute-0 podman[131989]: 2025-10-08 19:02:29.141480422 +0000 UTC m=+0.404438294 container cleanup 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 19:02:29 compute-0 podman[131989]: podman_exporter
Oct  8 19:02:29 compute-0 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Oct  8 19:02:29 compute-0 podman[132022]: podman_exporter
Oct  8 19:02:29 compute-0 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Oct  8 19:02:29 compute-0 systemd[1]: Stopped podman_exporter container.
Oct  8 19:02:29 compute-0 systemd[1]: Starting podman_exporter container...
Oct  8 19:02:29 compute-0 systemd[1]: Started libcrun container.
Oct  8 19:02:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a326ecbc4abca4f807e2a12af1256dd5205b8dffe1da0f2d2bd48cfaf93a0a8a/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  8 19:02:29 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a326ecbc4abca4f807e2a12af1256dd5205b8dffe1da0f2d2bd48cfaf93a0a8a/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct  8 19:02:29 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d.
Oct  8 19:02:29 compute-0 podman[132035]: 2025-10-08 19:02:29.499506337 +0000 UTC m=+0.232656275 container init 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 19:02:29 compute-0 podman_exporter[132050]: ts=2025-10-08T19:02:29.522Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Oct  8 19:02:29 compute-0 podman_exporter[132050]: ts=2025-10-08T19:02:29.522Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Oct  8 19:02:29 compute-0 podman_exporter[132050]: ts=2025-10-08T19:02:29.522Z caller=handler.go:94 level=info msg="enabled collectors"
Oct  8 19:02:29 compute-0 podman_exporter[132050]: ts=2025-10-08T19:02:29.522Z caller=handler.go:105 level=info collector=container
Oct  8 19:02:29 compute-0 podman[131799]: @ - - [08/Oct/2025:19:02:29 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Oct  8 19:02:29 compute-0 podman[131799]: time="2025-10-08T19:02:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct  8 19:02:29 compute-0 podman[132035]: 2025-10-08 19:02:29.54815699 +0000 UTC m=+0.281306868 container start 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 19:02:29 compute-0 podman[132035]: podman_exporter
Oct  8 19:02:29 compute-0 systemd[1]: Started podman_exporter container.
Oct  8 19:02:29 compute-0 podman[131799]: @ - - [08/Oct/2025:19:02:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 23088 "" "Go-http-client/1.1"
Oct  8 19:02:29 compute-0 podman_exporter[132050]: ts=2025-10-08T19:02:29.648Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Oct  8 19:02:29 compute-0 podman_exporter[132050]: ts=2025-10-08T19:02:29.649Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Oct  8 19:02:29 compute-0 podman_exporter[132050]: ts=2025-10-08T19:02:29.650Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Oct  8 19:02:29 compute-0 podman[132060]: 2025-10-08 19:02:29.65326067 +0000 UTC m=+0.095450535 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 19:02:29 compute-0 systemd[1]: 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d-7d9abd0262b7ee7b.service: Main process exited, code=exited, status=1/FAILURE
Oct  8 19:02:29 compute-0 systemd[1]: 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d-7d9abd0262b7ee7b.service: Failed with result 'exit-code'.
Oct  8 19:02:30 compute-0 python3.9[132233]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:02:31 compute-0 python3.9[132356]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759950149.851642-748-83928741393237/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  8 19:02:32 compute-0 python3.9[132508]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Oct  8 19:02:32 compute-0 python3.9[132660]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  8 19:02:33 compute-0 python3[132812]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Oct  8 19:02:36 compute-0 podman[132825]: 2025-10-08 19:02:36.353532347 +0000 UTC m=+2.521581659 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Oct  8 19:02:36 compute-0 podman[132923]: 2025-10-08 19:02:36.569637986 +0000 UTC m=+0.116159608 container create 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, distribution-scope=public, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, version=9.6, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350)
Oct  8 19:02:36 compute-0 podman[132923]: 2025-10-08 19:02:36.487449672 +0000 UTC m=+0.033971334 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Oct  8 19:02:36 compute-0 python3[132812]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7
Oct  8 19:02:37 compute-0 podman[133085]: 2025-10-08 19:02:37.353323491 +0000 UTC m=+0.083177403 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3)
Oct  8 19:02:37 compute-0 systemd[1]: e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f-39eec91179f1428e.service: Main process exited, code=exited, status=1/FAILURE
Oct  8 19:02:37 compute-0 systemd[1]: e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f-39eec91179f1428e.service: Failed with result 'exit-code'.
Oct  8 19:02:37 compute-0 python3.9[133133]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 19:02:38 compute-0 python3.9[133287]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:02:39 compute-0 python3.9[133438]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759950158.44593-801-150962681282531/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:02:39 compute-0 python3.9[133514]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  8 19:02:39 compute-0 systemd[1]: Reloading.
Oct  8 19:02:39 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 19:02:39 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 19:02:40 compute-0 python3.9[133625]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  8 19:02:40 compute-0 systemd[1]: Reloading.
Oct  8 19:02:40 compute-0 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  8 19:02:40 compute-0 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  8 19:02:41 compute-0 systemd[1]: Starting openstack_network_exporter container...
Oct  8 19:02:41 compute-0 systemd[1]: Started libcrun container.
Oct  8 19:02:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60cc909f2f382fd0d56d9aa7171681d4a4fdb2c9939442427688045fe76af904/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct  8 19:02:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60cc909f2f382fd0d56d9aa7171681d4a4fdb2c9939442427688045fe76af904/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  8 19:02:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60cc909f2f382fd0d56d9aa7171681d4a4fdb2c9939442427688045fe76af904/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct  8 19:02:41 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2.
Oct  8 19:02:41 compute-0 podman[133664]: 2025-10-08 19:02:41.347015611 +0000 UTC m=+0.158865931 container init 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, release=1755695350)
Oct  8 19:02:41 compute-0 openstack_network_exporter[133679]: INFO    19:02:41 main.go:48: registering *bridge.Collector
Oct  8 19:02:41 compute-0 openstack_network_exporter[133679]: INFO    19:02:41 main.go:48: registering *coverage.Collector
Oct  8 19:02:41 compute-0 openstack_network_exporter[133679]: INFO    19:02:41 main.go:48: registering *datapath.Collector
Oct  8 19:02:41 compute-0 openstack_network_exporter[133679]: INFO    19:02:41 main.go:48: registering *iface.Collector
Oct  8 19:02:41 compute-0 openstack_network_exporter[133679]: INFO    19:02:41 main.go:48: registering *memory.Collector
Oct  8 19:02:41 compute-0 openstack_network_exporter[133679]: INFO    19:02:41 main.go:48: registering *ovnnorthd.Collector
Oct  8 19:02:41 compute-0 openstack_network_exporter[133679]: INFO    19:02:41 main.go:48: registering *ovn.Collector
Oct  8 19:02:41 compute-0 openstack_network_exporter[133679]: INFO    19:02:41 main.go:48: registering *ovsdbserver.Collector
Oct  8 19:02:41 compute-0 openstack_network_exporter[133679]: INFO    19:02:41 main.go:48: registering *pmd_perf.Collector
Oct  8 19:02:41 compute-0 openstack_network_exporter[133679]: INFO    19:02:41 main.go:48: registering *pmd_rxq.Collector
Oct  8 19:02:41 compute-0 openstack_network_exporter[133679]: INFO    19:02:41 main.go:48: registering *vswitch.Collector
Oct  8 19:02:41 compute-0 openstack_network_exporter[133679]: NOTICE  19:02:41 main.go:76: listening on https://:9105/metrics
Oct  8 19:02:41 compute-0 podman[133664]: 2025-10-08 19:02:41.378071531 +0000 UTC m=+0.189921841 container start 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_id=edpm, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, architecture=x86_64, distribution-scope=public)
Oct  8 19:02:41 compute-0 podman[133664]: openstack_network_exporter
Oct  8 19:02:41 compute-0 systemd[1]: Started openstack_network_exporter container.
Oct  8 19:02:41 compute-0 podman[133689]: 2025-10-08 19:02:41.5030291 +0000 UTC m=+0.107759178 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, name=ubi9-minimal, build-date=2025-08-20T13:12:41, architecture=x86_64, 
com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-type=git, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., distribution-scope=public)
Oct  8 19:02:42 compute-0 python3.9[133864]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  8 19:02:42 compute-0 systemd[1]: Stopping openstack_network_exporter container...
Oct  8 19:02:42 compute-0 systemd[1]: libpod-58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2.scope: Deactivated successfully.
Oct  8 19:02:42 compute-0 podman[133868]: 2025-10-08 19:02:42.462501279 +0000 UTC m=+0.059975859 container died 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.buildah.version=1.33.7, distribution-scope=public, io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_id=edpm, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct  8 19:02:42 compute-0 systemd[1]: 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2-71d9224341fc07c4.timer: Deactivated successfully.
Oct  8 19:02:42 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2.
Oct  8 19:02:42 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2-userdata-shm.mount: Deactivated successfully.
Oct  8 19:02:42 compute-0 systemd[1]: var-lib-containers-storage-overlay-60cc909f2f382fd0d56d9aa7171681d4a4fdb2c9939442427688045fe76af904-merged.mount: Deactivated successfully.
Oct  8 19:02:42 compute-0 podman[133868]: 2025-10-08 19:02:42.938209512 +0000 UTC m=+0.535684112 container cleanup 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1755695350, version=9.6, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  8 19:02:42 compute-0 podman[133868]: openstack_network_exporter
Oct  8 19:02:42 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Oct  8 19:02:43 compute-0 podman[133897]: openstack_network_exporter
Oct  8 19:02:43 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Oct  8 19:02:43 compute-0 systemd[1]: Stopped openstack_network_exporter container.
Oct  8 19:02:43 compute-0 systemd[1]: Starting openstack_network_exporter container...
Oct  8 19:02:43 compute-0 systemd[1]: Started libcrun container.
Oct  8 19:02:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60cc909f2f382fd0d56d9aa7171681d4a4fdb2c9939442427688045fe76af904/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct  8 19:02:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60cc909f2f382fd0d56d9aa7171681d4a4fdb2c9939442427688045fe76af904/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  8 19:02:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60cc909f2f382fd0d56d9aa7171681d4a4fdb2c9939442427688045fe76af904/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct  8 19:02:43 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2.
Oct  8 19:02:43 compute-0 podman[133910]: 2025-10-08 19:02:43.197745155 +0000 UTC m=+0.141550175 container init 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, maintainer=Red Hat, Inc.)
Oct  8 19:02:43 compute-0 openstack_network_exporter[133927]: INFO    19:02:43 main.go:48: registering *bridge.Collector
Oct  8 19:02:43 compute-0 openstack_network_exporter[133927]: INFO    19:02:43 main.go:48: registering *coverage.Collector
Oct  8 19:02:43 compute-0 openstack_network_exporter[133927]: INFO    19:02:43 main.go:48: registering *datapath.Collector
Oct  8 19:02:43 compute-0 openstack_network_exporter[133927]: INFO    19:02:43 main.go:48: registering *iface.Collector
Oct  8 19:02:43 compute-0 openstack_network_exporter[133927]: INFO    19:02:43 main.go:48: registering *memory.Collector
Oct  8 19:02:43 compute-0 openstack_network_exporter[133927]: INFO    19:02:43 main.go:48: registering *ovnnorthd.Collector
Oct  8 19:02:43 compute-0 openstack_network_exporter[133927]: INFO    19:02:43 main.go:48: registering *ovn.Collector
Oct  8 19:02:43 compute-0 openstack_network_exporter[133927]: INFO    19:02:43 main.go:48: registering *ovsdbserver.Collector
Oct  8 19:02:43 compute-0 openstack_network_exporter[133927]: INFO    19:02:43 main.go:48: registering *pmd_perf.Collector
Oct  8 19:02:43 compute-0 openstack_network_exporter[133927]: INFO    19:02:43 main.go:48: registering *pmd_rxq.Collector
Oct  8 19:02:43 compute-0 openstack_network_exporter[133927]: INFO    19:02:43 main.go:48: registering *vswitch.Collector
Oct  8 19:02:43 compute-0 openstack_network_exporter[133927]: NOTICE  19:02:43 main.go:76: listening on https://:9105/metrics
Oct  8 19:02:43 compute-0 podman[133910]: 2025-10-08 19:02:43.228464315 +0000 UTC m=+0.172269335 container start 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, distribution-scope=public, architecture=x86_64, managed_by=edpm_ansible, io.openshift.expose-services=, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Oct  8 19:02:43 compute-0 podman[133910]: openstack_network_exporter
Oct  8 19:02:43 compute-0 systemd[1]: Started openstack_network_exporter container.
Oct  8 19:02:43 compute-0 podman[133937]: 2025-10-08 19:02:43.31768502 +0000 UTC m=+0.077053467 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.buildah.version=1.33.7, vcs-type=git, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, distribution-scope=public)
Oct  8 19:02:43 compute-0 podman[134081]: 2025-10-08 19:02:43.846125435 +0000 UTC m=+0.075084981 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  8 19:02:44 compute-0 python3.9[134127]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  8 19:02:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:02:44.221 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:02:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:02:44.222 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:02:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:02:44.222 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:02:45 compute-0 python3.9[134279]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Oct  8 19:02:46 compute-0 python3.9[134444]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  8 19:02:46 compute-0 systemd[1]: Started libpod-conmon-4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59.scope.
Oct  8 19:02:46 compute-0 podman[134445]: 2025-10-08 19:02:46.300253762 +0000 UTC m=+0.111439612 container exec 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 19:02:46 compute-0 podman[134445]: 2025-10-08 19:02:46.307711906 +0000 UTC m=+0.118897716 container exec_died 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  8 19:02:46 compute-0 systemd[1]: libpod-conmon-4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59.scope: Deactivated successfully.
Oct  8 19:02:47 compute-0 python3.9[134629]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  8 19:02:47 compute-0 systemd[1]: Started libpod-conmon-4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59.scope.
Oct  8 19:02:47 compute-0 podman[134630]: 2025-10-08 19:02:47.23768518 +0000 UTC m=+0.104497144 container exec 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 19:02:47 compute-0 podman[134630]: 2025-10-08 19:02:47.271591741 +0000 UTC m=+0.138403715 container exec_died 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  8 19:02:47 compute-0 systemd[1]: libpod-conmon-4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59.scope: Deactivated successfully.
Oct  8 19:02:47 compute-0 podman[134711]: 2025-10-08 19:02:47.681165221 +0000 UTC m=+0.084671566 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 19:02:48 compute-0 python3.9[134839]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:02:48 compute-0 python3.9[134991]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Oct  8 19:02:49 compute-0 podman[135127]: 2025-10-08 19:02:49.645056098 +0000 UTC m=+0.064473608 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Oct  8 19:02:49 compute-0 python3.9[135176]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  8 19:02:49 compute-0 systemd[1]: Started libpod-conmon-80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a.scope.
Oct  8 19:02:50 compute-0 podman[135177]: 2025-10-08 19:02:50.006089198 +0000 UTC m=+0.097501244 container exec 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 19:02:50 compute-0 podman[135177]: 2025-10-08 19:02:50.044303832 +0000 UTC m=+0.135715828 container exec_died 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3)
Oct  8 19:02:50 compute-0 systemd[1]: libpod-conmon-80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a.scope: Deactivated successfully.
Oct  8 19:02:50 compute-0 podman[135330]: 2025-10-08 19:02:50.625725344 +0000 UTC m=+0.075392371 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 19:02:50 compute-0 python3.9[135377]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  8 19:02:50 compute-0 systemd[1]: Started libpod-conmon-80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a.scope.
Oct  8 19:02:50 compute-0 podman[135378]: 2025-10-08 19:02:50.986305311 +0000 UTC m=+0.102194348 container exec 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001)
Oct  8 19:02:50 compute-0 podman[135378]: 2025-10-08 19:02:50.996243955 +0000 UTC m=+0.112132972 container exec_died 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  8 19:02:51 compute-0 systemd[1]: libpod-conmon-80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a.scope: Deactivated successfully.
Oct  8 19:02:51 compute-0 podman[135533]: 2025-10-08 19:02:51.728463786 +0000 UTC m=+0.126473343 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  8 19:02:51 compute-0 python3.9[135583]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:02:52 compute-0 python3.9[135740]: ansible-containers.podman.podman_container_info Invoked with name=['iscsid'] executable=podman
Oct  8 19:02:53 compute-0 python3.9[135906]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=iscsid detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  8 19:02:54 compute-0 systemd[1]: Started libpod-conmon-3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845.scope.
Oct  8 19:02:54 compute-0 podman[135907]: 2025-10-08 19:02:54.044731735 +0000 UTC m=+0.295569947 container exec 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=iscsid)
Oct  8 19:02:54 compute-0 podman[135907]: 2025-10-08 19:02:54.081313722 +0000 UTC m=+0.332151884 container exec_died 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3)
Oct  8 19:02:54 compute-0 systemd[1]: libpod-conmon-3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845.scope: Deactivated successfully.
Oct  8 19:02:54 compute-0 python3.9[136093]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=iscsid detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  8 19:02:55 compute-0 systemd[1]: Started libpod-conmon-3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845.scope.
Oct  8 19:02:55 compute-0 podman[136094]: 2025-10-08 19:02:55.064764849 +0000 UTC m=+0.078507210 container exec 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20251001, io.buildah.version=1.41.3)
Oct  8 19:02:55 compute-0 podman[136094]: 2025-10-08 19:02:55.097328671 +0000 UTC m=+0.111071032 container exec_died 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 19:02:55 compute-0 systemd[1]: libpod-conmon-3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845.scope: Deactivated successfully.
Oct  8 19:02:55 compute-0 python3.9[136275]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/iscsid recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:02:56 compute-0 python3.9[136427]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Oct  8 19:02:57 compute-0 python3.9[136593]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  8 19:02:57 compute-0 systemd[1]: Started libpod-conmon-62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d.scope.
Oct  8 19:02:57 compute-0 podman[136594]: 2025-10-08 19:02:57.763029507 +0000 UTC m=+0.077833850 container exec 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=multipathd)
Oct  8 19:02:57 compute-0 podman[136594]: 2025-10-08 19:02:57.794253721 +0000 UTC m=+0.109058094 container exec_died 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  8 19:02:57 compute-0 systemd[1]: libpod-conmon-62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d.scope: Deactivated successfully.
Oct  8 19:02:58 compute-0 python3.9[136777]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  8 19:02:58 compute-0 systemd[1]: Started libpod-conmon-62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d.scope.
Oct  8 19:02:58 compute-0 podman[136778]: 2025-10-08 19:02:58.78998571 +0000 UTC m=+0.096366361 container exec 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, org.label-schema.build-date=20251001)
Oct  8 19:02:58 compute-0 podman[136778]: 2025-10-08 19:02:58.823287503 +0000 UTC m=+0.129668104 container exec_died 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Oct  8 19:02:58 compute-0 systemd[1]: libpod-conmon-62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d.scope: Deactivated successfully.
Oct  8 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.104 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.129 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.129 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.131 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.131 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.132 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.172 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.173 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.174 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.175 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.417 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.419 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6293MB free_disk=73.4556770324707GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.419 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.419 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.507 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.508 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.533 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.554 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.556 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 19:02:59 compute-0 nova_compute[117514]: 2025-10-08 19:02:59.556 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:02:59 compute-0 python3.9[136964]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:03:00 compute-0 nova_compute[117514]: 2025-10-08 19:03:00.165 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:03:00 compute-0 podman[137088]: 2025-10-08 19:03:00.389460069 +0000 UTC m=+0.089615597 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 19:03:00 compute-0 python3.9[137136]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Oct  8 19:03:00 compute-0 nova_compute[117514]: 2025-10-08 19:03:00.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:03:00 compute-0 nova_compute[117514]: 2025-10-08 19:03:00.717 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 19:03:00 compute-0 nova_compute[117514]: 2025-10-08 19:03:00.717 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 19:03:00 compute-0 nova_compute[117514]: 2025-10-08 19:03:00.732 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 19:03:00 compute-0 nova_compute[117514]: 2025-10-08 19:03:00.733 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:03:00 compute-0 nova_compute[117514]: 2025-10-08 19:03:00.734 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:03:00 compute-0 nova_compute[117514]: 2025-10-08 19:03:00.734 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:03:01 compute-0 python3.9[137306]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  8 19:03:01 compute-0 systemd[1]: Started libpod-conmon-e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f.scope.
Oct  8 19:03:01 compute-0 podman[137307]: 2025-10-08 19:03:01.664927498 +0000 UTC m=+0.120427730 container exec e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 19:03:01 compute-0 podman[137307]: 2025-10-08 19:03:01.701362642 +0000 UTC m=+0.156862874 container exec_died e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible)
Oct  8 19:03:01 compute-0 systemd[1]: libpod-conmon-e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f.scope: Deactivated successfully.
Oct  8 19:03:02 compute-0 python3.9[137490]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  8 19:03:02 compute-0 systemd[1]: Started libpod-conmon-e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f.scope.
Oct  8 19:03:02 compute-0 podman[137491]: 2025-10-08 19:03:02.872543715 +0000 UTC m=+0.304102561 container exec e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  8 19:03:02 compute-0 podman[137491]: 2025-10-08 19:03:02.906359203 +0000 UTC m=+0.337918039 container exec_died e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  8 19:03:02 compute-0 systemd[1]: libpod-conmon-e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f.scope: Deactivated successfully.
Oct  8 19:03:03 compute-0 python3.9[137675]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:03:04 compute-0 python3.9[137827]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Oct  8 19:03:05 compute-0 python3.9[137992]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  8 19:03:05 compute-0 systemd[1]: Started libpod-conmon-9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213.scope.
Oct  8 19:03:05 compute-0 podman[137993]: 2025-10-08 19:03:05.508119708 +0000 UTC m=+0.103874296 container exec 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 19:03:05 compute-0 podman[137993]: 2025-10-08 19:03:05.546404434 +0000 UTC m=+0.142159042 container exec_died 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 19:03:05 compute-0 systemd[1]: libpod-conmon-9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213.scope: Deactivated successfully.
Oct  8 19:03:06 compute-0 python3.9[138176]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  8 19:03:06 compute-0 systemd[1]: Started libpod-conmon-9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213.scope.
Oct  8 19:03:06 compute-0 podman[138177]: 2025-10-08 19:03:06.452466614 +0000 UTC m=+0.104266537 container exec 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 19:03:06 compute-0 podman[138177]: 2025-10-08 19:03:06.490588736 +0000 UTC m=+0.142388619 container exec_died 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 19:03:06 compute-0 systemd[1]: libpod-conmon-9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213.scope: Deactivated successfully.
Oct  8 19:03:07 compute-0 python3.9[138358]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:03:07 compute-0 podman[138395]: 2025-10-08 19:03:07.685728955 +0000 UTC m=+0.088184586 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute)
Oct  8 19:03:08 compute-0 python3.9[138530]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Oct  8 19:03:09 compute-0 python3.9[138696]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  8 19:03:09 compute-0 systemd[1]: Started libpod-conmon-9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d.scope.
Oct  8 19:03:09 compute-0 podman[138697]: 2025-10-08 19:03:09.399585628 +0000 UTC m=+0.312782607 container exec 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 19:03:09 compute-0 podman[138697]: 2025-10-08 19:03:09.435216497 +0000 UTC m=+0.348413396 container exec_died 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 19:03:09 compute-0 systemd[1]: libpod-conmon-9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d.scope: Deactivated successfully.
Oct  8 19:03:10 compute-0 python3.9[138880]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  8 19:03:10 compute-0 systemd[1]: Started libpod-conmon-9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d.scope.
Oct  8 19:03:10 compute-0 podman[138881]: 2025-10-08 19:03:10.435496636 +0000 UTC m=+0.106556318 container exec 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 19:03:10 compute-0 podman[138881]: 2025-10-08 19:03:10.473642277 +0000 UTC m=+0.144701869 container exec_died 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 19:03:10 compute-0 systemd[1]: libpod-conmon-9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d.scope: Deactivated successfully.
Oct  8 19:03:11 compute-0 python3.9[139064]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:03:12 compute-0 python3.9[139216]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Oct  8 19:03:12 compute-0 python3.9[139381]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  8 19:03:12 compute-0 systemd[1]: Started libpod-conmon-58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2.scope.
Oct  8 19:03:12 compute-0 podman[139382]: 2025-10-08 19:03:12.957532087 +0000 UTC m=+0.079519295 container exec 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, name=ubi9-minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vcs-type=git, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct  8 19:03:12 compute-0 podman[139382]: 2025-10-08 19:03:12.987102833 +0000 UTC m=+0.109090021 container exec_died 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.openshift.expose-services=, managed_by=edpm_ansible, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, release=1755695350, name=ubi9-minimal, config_id=edpm, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter)
Oct  8 19:03:13 compute-0 systemd[1]: libpod-conmon-58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2.scope: Deactivated successfully.
Oct  8 19:03:13 compute-0 podman[139537]: 2025-10-08 19:03:13.60305913 +0000 UTC m=+0.091145458 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, release=1755695350)
Oct  8 19:03:13 compute-0 python3.9[139582]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  8 19:03:13 compute-0 systemd[1]: Started libpod-conmon-58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2.scope.
Oct  8 19:03:13 compute-0 podman[139588]: 2025-10-08 19:03:13.907368253 +0000 UTC m=+0.098693563 container exec 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, managed_by=edpm_ansible, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct  8 19:03:13 compute-0 podman[139588]: 2025-10-08 19:03:13.940063789 +0000 UTC m=+0.131389049 container exec_died 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, architecture=x86_64, io.buildah.version=1.33.7, release=1755695350, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41)
Oct  8 19:03:13 compute-0 systemd[1]: libpod-conmon-58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2.scope: Deactivated successfully.
Oct  8 19:03:13 compute-0 podman[139604]: 2025-10-08 19:03:13.983640515 +0000 UTC m=+0.076088927 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  8 19:03:14 compute-0 python3.9[139791]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:03:15 compute-0 python3.9[139943]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:03:16 compute-0 python3.9[140095]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:03:16 compute-0 python3.9[140218]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759950195.7130418-1115-55922560681916/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:03:17 compute-0 python3.9[140370]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:03:18 compute-0 podman[140494]: 2025-10-08 19:03:18.411806433 +0000 UTC m=+0.077687373 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 19:03:18 compute-0 python3.9[140537]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:03:19 compute-0 python3.9[140622]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:03:19 compute-0 python3.9[140774]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:03:20 compute-0 podman[140824]: 2025-10-08 19:03:20.113913764 +0000 UTC m=+0.060090990 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  8 19:03:20 compute-0 python3.9[140870]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.1hdy_udt recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:03:20 compute-0 podman[140994]: 2025-10-08 19:03:20.904836325 +0000 UTC m=+0.071880537 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent)
Oct  8 19:03:21 compute-0 python3.9[141035]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:03:21 compute-0 python3.9[141117]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:03:22 compute-0 podman[141241]: 2025-10-08 19:03:22.223815208 +0000 UTC m=+0.119222200 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  8 19:03:22 compute-0 python3.9[141282]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 19:03:23 compute-0 python3[141448]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct  8 19:03:24 compute-0 python3.9[141600]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:03:24 compute-0 python3.9[141678]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:03:25 compute-0 python3.9[141830]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:03:25 compute-0 python3.9[141908]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:03:26 compute-0 python3.9[142060]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:03:27 compute-0 python3.9[142138]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:03:27 compute-0 python3.9[142290]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:03:28 compute-0 python3.9[142368]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:03:29 compute-0 python3.9[142520]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  8 19:03:29 compute-0 python3.9[142645]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759950208.58123-1240-253803914708402/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:03:30 compute-0 podman[142769]: 2025-10-08 19:03:30.564199927 +0000 UTC m=+0.101591527 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 19:03:30 compute-0 python3.9[142821]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:03:31 compute-0 python3.9[142973]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 19:03:32 compute-0 python3.9[143128]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:03:33 compute-0 python3.9[143280]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 19:03:34 compute-0 python3.9[143433]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  8 19:03:34 compute-0 python3.9[143587]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  8 19:03:35 compute-0 python3.9[143742]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  8 19:03:36 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Oct  8 19:03:36 compute-0 systemd[1]: session-11.scope: Consumed 1min 59.818s CPU time.
Oct  8 19:03:36 compute-0 systemd-logind[844]: Session 11 logged out. Waiting for processes to exit.
Oct  8 19:03:36 compute-0 systemd-logind[844]: Removed session 11.
Oct  8 19:03:38 compute-0 podman[143767]: 2025-10-08 19:03:38.691392199 +0000 UTC m=+0.104698566 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 19:03:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:03:44.223 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:03:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:03:44.223 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:03:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:03:44.223 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:03:44 compute-0 podman[143789]: 2025-10-08 19:03:44.637956794 +0000 UTC m=+0.056485817 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS)
Oct  8 19:03:44 compute-0 podman[143788]: 2025-10-08 19:03:44.653700364 +0000 UTC m=+0.068965494 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-type=git, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, distribution-scope=public, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, managed_by=edpm_ansible, release=1755695350)
Oct  8 19:03:46 compute-0 systemd[1]: Stopping User Manager for UID 1000...
Oct  8 19:03:46 compute-0 systemd[1314]: Activating special unit Exit the Session...
Oct  8 19:03:46 compute-0 systemd[1314]: Removed slice User Background Tasks Slice.
Oct  8 19:03:46 compute-0 systemd[1314]: Stopped target Main User Target.
Oct  8 19:03:46 compute-0 systemd[1314]: Stopped target Basic System.
Oct  8 19:03:46 compute-0 systemd[1314]: Stopped target Paths.
Oct  8 19:03:46 compute-0 systemd[1314]: Stopped target Sockets.
Oct  8 19:03:46 compute-0 systemd[1314]: Stopped target Timers.
Oct  8 19:03:46 compute-0 systemd[1314]: Stopped Mark boot as successful after the user session has run 2 minutes.
Oct  8 19:03:46 compute-0 systemd[1314]: Stopped Daily Cleanup of User's Temporary Directories.
Oct  8 19:03:46 compute-0 systemd[1314]: Closed D-Bus User Message Bus Socket.
Oct  8 19:03:46 compute-0 systemd[1314]: Stopped Create User's Volatile Files and Directories.
Oct  8 19:03:46 compute-0 systemd[1314]: Removed slice User Application Slice.
Oct  8 19:03:46 compute-0 systemd[1314]: Reached target Shutdown.
Oct  8 19:03:46 compute-0 systemd[1314]: Finished Exit the Session.
Oct  8 19:03:46 compute-0 systemd[1314]: Reached target Exit the Session.
Oct  8 19:03:46 compute-0 systemd[1]: user@1000.service: Deactivated successfully.
Oct  8 19:03:46 compute-0 systemd[1]: Stopped User Manager for UID 1000.
Oct  8 19:03:46 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/1000...
Oct  8 19:03:46 compute-0 systemd[1]: run-user-1000.mount: Deactivated successfully.
Oct  8 19:03:46 compute-0 systemd[1]: user-runtime-dir@1000.service: Deactivated successfully.
Oct  8 19:03:46 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/1000.
Oct  8 19:03:46 compute-0 systemd[1]: Removed slice User Slice of UID 1000.
Oct  8 19:03:46 compute-0 systemd[1]: user-1000.slice: Consumed 10min 35.549s CPU time.
Oct  8 19:03:48 compute-0 podman[143830]: 2025-10-08 19:03:48.667017896 +0000 UTC m=+0.085690072 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 19:03:49 compute-0 systemd[1]: Starting Cleanup of Temporary Directories...
Oct  8 19:03:49 compute-0 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Oct  8 19:03:49 compute-0 systemd[1]: Finished Cleanup of Temporary Directories.
Oct  8 19:03:49 compute-0 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Oct  8 19:03:50 compute-0 podman[143858]: 2025-10-08 19:03:50.701182374 +0000 UTC m=+0.113431187 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=iscsid, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 19:03:51 compute-0 podman[143879]: 2025-10-08 19:03:51.639365306 +0000 UTC m=+0.061095409 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  8 19:03:52 compute-0 podman[143899]: 2025-10-08 19:03:52.719719676 +0000 UTC m=+0.135457015 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Oct  8 19:03:59 compute-0 nova_compute[117514]: 2025-10-08 19:03:59.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:03:59 compute-0 nova_compute[117514]: 2025-10-08 19:03:59.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:03:59 compute-0 nova_compute[117514]: 2025-10-08 19:03:59.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:03:59 compute-0 nova_compute[117514]: 2025-10-08 19:03:59.717 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 19:03:59 compute-0 nova_compute[117514]: 2025-10-08 19:03:59.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:03:59 compute-0 nova_compute[117514]: 2025-10-08 19:03:59.811 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:03:59 compute-0 nova_compute[117514]: 2025-10-08 19:03:59.812 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:03:59 compute-0 nova_compute[117514]: 2025-10-08 19:03:59.813 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:03:59 compute-0 nova_compute[117514]: 2025-10-08 19:03:59.813 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 19:03:59 compute-0 nova_compute[117514]: 2025-10-08 19:03:59.976 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 19:03:59 compute-0 nova_compute[117514]: 2025-10-08 19:03:59.977 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6417MB free_disk=73.45592880249023GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 19:03:59 compute-0 nova_compute[117514]: 2025-10-08 19:03:59.977 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:03:59 compute-0 nova_compute[117514]: 2025-10-08 19:03:59.977 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:04:00 compute-0 nova_compute[117514]: 2025-10-08 19:04:00.039 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 19:04:00 compute-0 nova_compute[117514]: 2025-10-08 19:04:00.039 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 19:04:00 compute-0 nova_compute[117514]: 2025-10-08 19:04:00.063 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:04:00 compute-0 nova_compute[117514]: 2025-10-08 19:04:00.080 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:04:00 compute-0 nova_compute[117514]: 2025-10-08 19:04:00.082 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 19:04:00 compute-0 nova_compute[117514]: 2025-10-08 19:04:00.082 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:04:01 compute-0 nova_compute[117514]: 2025-10-08 19:04:01.082 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:04:01 compute-0 podman[143927]: 2025-10-08 19:04:01.627640846 +0000 UTC m=+0.056508348 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 19:04:01 compute-0 nova_compute[117514]: 2025-10-08 19:04:01.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:04:02 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:04:02.201 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:75:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '5e:14:dd:63:55:2a'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:04:02 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:04:02.202 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 19:04:02 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:04:02.203 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f81f7a-64d8-418a-a74c-b879bd6deb83, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:04:02 compute-0 nova_compute[117514]: 2025-10-08 19:04:02.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:04:02 compute-0 nova_compute[117514]: 2025-10-08 19:04:02.717 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 19:04:02 compute-0 nova_compute[117514]: 2025-10-08 19:04:02.717 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 19:04:02 compute-0 nova_compute[117514]: 2025-10-08 19:04:02.738 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 19:04:02 compute-0 nova_compute[117514]: 2025-10-08 19:04:02.738 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:04:02 compute-0 nova_compute[117514]: 2025-10-08 19:04:02.739 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.240 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.241 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.241 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.241 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.241 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.241 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:04:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:04:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:04:09 compute-0 podman[143951]: 2025-10-08 19:04:09.672256146 +0000 UTC m=+0.084479987 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  8 19:04:15 compute-0 podman[143972]: 2025-10-08 19:04:15.682642518 +0000 UTC m=+0.081173974 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal)
Oct  8 19:04:15 compute-0 podman[143973]: 2025-10-08 19:04:15.71675427 +0000 UTC m=+0.111370127 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  8 19:04:19 compute-0 podman[144011]: 2025-10-08 19:04:19.638416775 +0000 UTC m=+0.058580289 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 19:04:21 compute-0 podman[144035]: 2025-10-08 19:04:21.643920093 +0000 UTC m=+0.066956371 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=iscsid)
Oct  8 19:04:21 compute-0 podman[144055]: 2025-10-08 19:04:21.736928232 +0000 UTC m=+0.052568683 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  8 19:04:23 compute-0 podman[144077]: 2025-10-08 19:04:23.700680195 +0000 UTC m=+0.116906820 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Oct  8 19:04:32 compute-0 podman[144103]: 2025-10-08 19:04:32.661187107 +0000 UTC m=+0.066913850 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 19:04:34 compute-0 systemd[1]: packagekit.service: Deactivated successfully.
Oct  8 19:04:40 compute-0 podman[144130]: 2025-10-08 19:04:40.659059148 +0000 UTC m=+0.073807329 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, io.buildah.version=1.41.3)
Oct  8 19:04:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:04:44.224 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:04:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:04:44.224 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:04:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:04:44.224 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:04:46 compute-0 podman[144151]: 2025-10-08 19:04:46.669633425 +0000 UTC m=+0.083505608 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3)
Oct  8 19:04:46 compute-0 podman[144150]: 2025-10-08 19:04:46.670121728 +0000 UTC m=+0.086581122 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., config_id=edpm, distribution-scope=public, architecture=x86_64, managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7)
Oct  8 19:04:50 compute-0 podman[144193]: 2025-10-08 19:04:50.629452395 +0000 UTC m=+0.053912720 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 19:04:52 compute-0 podman[144218]: 2025-10-08 19:04:52.670543534 +0000 UTC m=+0.085021369 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  8 19:04:52 compute-0 podman[144219]: 2025-10-08 19:04:52.675299486 +0000 UTC m=+0.083375774 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 19:04:54 compute-0 podman[144255]: 2025-10-08 19:04:54.734844546 +0000 UTC m=+0.146546949 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 19:05:00 compute-0 nova_compute[117514]: 2025-10-08 19:05:00.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:05:00 compute-0 nova_compute[117514]: 2025-10-08 19:05:00.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:05:00 compute-0 nova_compute[117514]: 2025-10-08 19:05:00.742 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:05:01 compute-0 nova_compute[117514]: 2025-10-08 19:05:01.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:05:01 compute-0 nova_compute[117514]: 2025-10-08 19:05:01.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:05:01 compute-0 nova_compute[117514]: 2025-10-08 19:05:01.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:05:01 compute-0 nova_compute[117514]: 2025-10-08 19:05:01.717 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 19:05:01 compute-0 nova_compute[117514]: 2025-10-08 19:05:01.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:05:01 compute-0 nova_compute[117514]: 2025-10-08 19:05:01.745 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:05:01 compute-0 nova_compute[117514]: 2025-10-08 19:05:01.745 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:05:01 compute-0 nova_compute[117514]: 2025-10-08 19:05:01.745 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:05:01 compute-0 nova_compute[117514]: 2025-10-08 19:05:01.745 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 19:05:01 compute-0 nova_compute[117514]: 2025-10-08 19:05:01.958 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 19:05:01 compute-0 nova_compute[117514]: 2025-10-08 19:05:01.959 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6430MB free_disk=73.45976257324219GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 19:05:01 compute-0 nova_compute[117514]: 2025-10-08 19:05:01.959 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:05:01 compute-0 nova_compute[117514]: 2025-10-08 19:05:01.959 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:05:02 compute-0 nova_compute[117514]: 2025-10-08 19:05:02.025 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 19:05:02 compute-0 nova_compute[117514]: 2025-10-08 19:05:02.026 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 19:05:02 compute-0 nova_compute[117514]: 2025-10-08 19:05:02.044 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:05:02 compute-0 nova_compute[117514]: 2025-10-08 19:05:02.056 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:05:02 compute-0 nova_compute[117514]: 2025-10-08 19:05:02.058 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 19:05:02 compute-0 nova_compute[117514]: 2025-10-08 19:05:02.058 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:05:03 compute-0 nova_compute[117514]: 2025-10-08 19:05:03.057 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:05:03 compute-0 nova_compute[117514]: 2025-10-08 19:05:03.058 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 19:05:03 compute-0 nova_compute[117514]: 2025-10-08 19:05:03.059 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 19:05:03 compute-0 nova_compute[117514]: 2025-10-08 19:05:03.074 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 19:05:03 compute-0 nova_compute[117514]: 2025-10-08 19:05:03.075 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:05:03 compute-0 podman[144283]: 2025-10-08 19:05:03.651724363 +0000 UTC m=+0.063449983 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 19:05:03 compute-0 nova_compute[117514]: 2025-10-08 19:05:03.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:05:11 compute-0 podman[144307]: 2025-10-08 19:05:11.677316539 +0000 UTC m=+0.087484758 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm)
Oct  8 19:05:17 compute-0 podman[144328]: 2025-10-08 19:05:17.690786845 +0000 UTC m=+0.088887747 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 19:05:17 compute-0 podman[144327]: 2025-10-08 19:05:17.70835938 +0000 UTC m=+0.115414619 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, io.openshift.tags=minimal rhel9, release=1755695350)
Oct  8 19:05:21 compute-0 podman[144368]: 2025-10-08 19:05:21.666197128 +0000 UTC m=+0.078700313 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 19:05:23 compute-0 podman[144393]: 2025-10-08 19:05:23.678810035 +0000 UTC m=+0.087381226 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  8 19:05:23 compute-0 podman[144394]: 2025-10-08 19:05:23.694185125 +0000 UTC m=+0.097390596 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct  8 19:05:25 compute-0 podman[144430]: 2025-10-08 19:05:25.703162859 +0000 UTC m=+0.118436485 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct  8 19:05:31 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:31.021 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:75:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '5e:14:dd:63:55:2a'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:05:31 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:31.023 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 19:05:34 compute-0 podman[144458]: 2025-10-08 19:05:34.659265754 +0000 UTC m=+0.073058035 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 19:05:39 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:39.025 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f81f7a-64d8-418a-a74c-b879bd6deb83, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:05:42 compute-0 podman[144482]: 2025-10-08 19:05:42.679555656 +0000 UTC m=+0.092844989 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  8 19:05:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:44.226 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:05:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:44.226 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:05:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:44.226 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:05:48 compute-0 podman[144503]: 2025-10-08 19:05:48.676607835 +0000 UTC m=+0.090876773 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, distribution-scope=public, io.openshift.tags=minimal rhel9, release=1755695350, container_name=openstack_network_exporter, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct  8 19:05:48 compute-0 podman[144504]: 2025-10-08 19:05:48.68356069 +0000 UTC m=+0.086806930 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 19:05:50 compute-0 nova_compute[117514]: 2025-10-08 19:05:50.789 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "533c431a-8ae8-4310-81dc-29285b78f93c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:05:50 compute-0 nova_compute[117514]: 2025-10-08 19:05:50.789 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "533c431a-8ae8-4310-81dc-29285b78f93c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:05:50 compute-0 nova_compute[117514]: 2025-10-08 19:05:50.811 2 DEBUG nova.compute.manager [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 19:05:50 compute-0 nova_compute[117514]: 2025-10-08 19:05:50.929 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:05:50 compute-0 nova_compute[117514]: 2025-10-08 19:05:50.930 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:05:50 compute-0 nova_compute[117514]: 2025-10-08 19:05:50.939 2 DEBUG nova.virt.hardware [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 19:05:50 compute-0 nova_compute[117514]: 2025-10-08 19:05:50.940 2 INFO nova.compute.claims [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Claim successful on node compute-0.ctlplane.example.com#033[00m
Oct  8 19:05:51 compute-0 nova_compute[117514]: 2025-10-08 19:05:51.041 2 DEBUG nova.compute.provider_tree [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:05:51 compute-0 nova_compute[117514]: 2025-10-08 19:05:51.055 2 DEBUG nova.scheduler.client.report [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:05:51 compute-0 nova_compute[117514]: 2025-10-08 19:05:51.080 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:05:51 compute-0 nova_compute[117514]: 2025-10-08 19:05:51.081 2 DEBUG nova.compute.manager [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 19:05:51 compute-0 nova_compute[117514]: 2025-10-08 19:05:51.125 2 DEBUG nova.compute.manager [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 19:05:51 compute-0 nova_compute[117514]: 2025-10-08 19:05:51.126 2 DEBUG nova.network.neutron [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 19:05:51 compute-0 nova_compute[117514]: 2025-10-08 19:05:51.162 2 INFO nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 19:05:51 compute-0 nova_compute[117514]: 2025-10-08 19:05:51.185 2 DEBUG nova.compute.manager [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 19:05:51 compute-0 nova_compute[117514]: 2025-10-08 19:05:51.271 2 DEBUG nova.compute.manager [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 19:05:51 compute-0 nova_compute[117514]: 2025-10-08 19:05:51.273 2 DEBUG nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 19:05:51 compute-0 nova_compute[117514]: 2025-10-08 19:05:51.274 2 INFO nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Creating image(s)#033[00m
Oct  8 19:05:51 compute-0 nova_compute[117514]: 2025-10-08 19:05:51.275 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "/var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:05:51 compute-0 nova_compute[117514]: 2025-10-08 19:05:51.275 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:05:51 compute-0 nova_compute[117514]: 2025-10-08 19:05:51.276 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:05:51 compute-0 nova_compute[117514]: 2025-10-08 19:05:51.277 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "008eb3078b811ee47058b7252a820910c35fc6df" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:05:51 compute-0 nova_compute[117514]: 2025-10-08 19:05:51.278 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:05:52 compute-0 nova_compute[117514]: 2025-10-08 19:05:52.074 2 WARNING oslo_policy.policy [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Oct  8 19:05:52 compute-0 nova_compute[117514]: 2025-10-08 19:05:52.074 2 WARNING oslo_policy.policy [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Oct  8 19:05:52 compute-0 nova_compute[117514]: 2025-10-08 19:05:52.079 2 DEBUG nova.policy [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 19:05:52 compute-0 podman[144545]: 2025-10-08 19:05:52.669992856 +0000 UTC m=+0.084613048 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 19:05:52 compute-0 nova_compute[117514]: 2025-10-08 19:05:52.683 2 DEBUG oslo_concurrency.processutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:05:52 compute-0 nova_compute[117514]: 2025-10-08 19:05:52.788 2 DEBUG oslo_concurrency.processutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df.part --force-share --output=json" returned: 0 in 0.105s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:05:52 compute-0 nova_compute[117514]: 2025-10-08 19:05:52.789 2 DEBUG nova.virt.images [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] 23cfa426-7011-4566-992d-1c7af39f70dd was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Oct  8 19:05:52 compute-0 nova_compute[117514]: 2025-10-08 19:05:52.792 2 DEBUG nova.privsep.utils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  8 19:05:52 compute-0 nova_compute[117514]: 2025-10-08 19:05:52.793 2 DEBUG oslo_concurrency.processutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df.part /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:05:53 compute-0 nova_compute[117514]: 2025-10-08 19:05:53.049 2 DEBUG oslo_concurrency.processutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df.part /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df.converted" returned: 0 in 0.257s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:05:53 compute-0 nova_compute[117514]: 2025-10-08 19:05:53.054 2 DEBUG oslo_concurrency.processutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:05:53 compute-0 nova_compute[117514]: 2025-10-08 19:05:53.151 2 DEBUG oslo_concurrency.processutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df.converted --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:05:53 compute-0 nova_compute[117514]: 2025-10-08 19:05:53.153 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.874s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:05:53 compute-0 nova_compute[117514]: 2025-10-08 19:05:53.166 2 INFO oslo.privsep.daemon [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpp8ow9d39/privsep.sock']#033[00m
Oct  8 19:05:53 compute-0 nova_compute[117514]: 2025-10-08 19:05:53.218 2 DEBUG nova.network.neutron [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Successfully created port: 82f4743a-dcdc-49f7-be61-94d565e29842 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 19:05:53 compute-0 nova_compute[117514]: 2025-10-08 19:05:53.900 2 INFO oslo.privsep.daemon [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Spawned new privsep daemon via rootwrap#033[00m
Oct  8 19:05:53 compute-0 nova_compute[117514]: 2025-10-08 19:05:53.766 54 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct  8 19:05:53 compute-0 nova_compute[117514]: 2025-10-08 19:05:53.770 54 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct  8 19:05:53 compute-0 nova_compute[117514]: 2025-10-08 19:05:53.773 54 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Oct  8 19:05:53 compute-0 nova_compute[117514]: 2025-10-08 19:05:53.773 54 INFO oslo.privsep.daemon [-] privsep daemon running as pid 54#033[00m
Oct  8 19:05:53 compute-0 nova_compute[117514]: 2025-10-08 19:05:53.994 2 DEBUG oslo_concurrency.processutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.047 2 DEBUG oslo_concurrency.processutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.048 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "008eb3078b811ee47058b7252a820910c35fc6df" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.049 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.060 2 DEBUG oslo_concurrency.processutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.116 2 DEBUG oslo_concurrency.processutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.117 2 DEBUG oslo_concurrency.processutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.152 2 DEBUG oslo_concurrency.processutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.153 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.153 2 DEBUG oslo_concurrency.processutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.210 2 DEBUG oslo_concurrency.processutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.211 2 DEBUG nova.virt.disk.api [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Checking if we can resize image /var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  8 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.212 2 DEBUG oslo_concurrency.processutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.273 2 DEBUG oslo_concurrency.processutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.275 2 DEBUG nova.virt.disk.api [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Cannot resize image /var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  8 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.275 2 DEBUG nova.objects.instance [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'migration_context' on Instance uuid 533c431a-8ae8-4310-81dc-29285b78f93c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.291 2 DEBUG nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.291 2 DEBUG nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Ensure instance console log exists: /var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.292 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.292 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:05:54 compute-0 nova_compute[117514]: 2025-10-08 19:05:54.292 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:05:54 compute-0 podman[144606]: 2025-10-08 19:05:54.678196449 +0000 UTC m=+0.077215511 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct  8 19:05:54 compute-0 podman[144605]: 2025-10-08 19:05:54.694376142 +0000 UTC m=+0.104346851 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 19:05:55 compute-0 nova_compute[117514]: 2025-10-08 19:05:55.061 2 DEBUG nova.network.neutron [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Successfully updated port: 82f4743a-dcdc-49f7-be61-94d565e29842 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 19:05:55 compute-0 nova_compute[117514]: 2025-10-08 19:05:55.077 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "refresh_cache-533c431a-8ae8-4310-81dc-29285b78f93c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:05:55 compute-0 nova_compute[117514]: 2025-10-08 19:05:55.077 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquired lock "refresh_cache-533c431a-8ae8-4310-81dc-29285b78f93c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:05:55 compute-0 nova_compute[117514]: 2025-10-08 19:05:55.078 2 DEBUG nova.network.neutron [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 19:05:55 compute-0 nova_compute[117514]: 2025-10-08 19:05:55.240 2 DEBUG nova.network.neutron [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 19:05:55 compute-0 nova_compute[117514]: 2025-10-08 19:05:55.564 2 DEBUG nova.compute.manager [req-3474fb1f-ad2c-42e9-b2d3-dd50af38bc26 req-4424d32f-9f5d-4126-85c0-9cd46f5f1538 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Received event network-changed-82f4743a-dcdc-49f7-be61-94d565e29842 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:05:55 compute-0 nova_compute[117514]: 2025-10-08 19:05:55.564 2 DEBUG nova.compute.manager [req-3474fb1f-ad2c-42e9-b2d3-dd50af38bc26 req-4424d32f-9f5d-4126-85c0-9cd46f5f1538 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Refreshing instance network info cache due to event network-changed-82f4743a-dcdc-49f7-be61-94d565e29842. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 19:05:55 compute-0 nova_compute[117514]: 2025-10-08 19:05:55.565 2 DEBUG oslo_concurrency.lockutils [req-3474fb1f-ad2c-42e9-b2d3-dd50af38bc26 req-4424d32f-9f5d-4126-85c0-9cd46f5f1538 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-533c431a-8ae8-4310-81dc-29285b78f93c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.018 2 DEBUG nova.network.neutron [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Updating instance_info_cache with network_info: [{"id": "82f4743a-dcdc-49f7-be61-94d565e29842", "address": "fa:16:3e:2e:6b:6c", "network": {"id": "a913b285-6d0a-478e-aa24-18bb458d8f7a", "bridge": "br-int", "label": "tempest-network-smoke--994528417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82f4743a-dc", "ovs_interfaceid": "82f4743a-dcdc-49f7-be61-94d565e29842", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.051 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Releasing lock "refresh_cache-533c431a-8ae8-4310-81dc-29285b78f93c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.052 2 DEBUG nova.compute.manager [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Instance network_info: |[{"id": "82f4743a-dcdc-49f7-be61-94d565e29842", "address": "fa:16:3e:2e:6b:6c", "network": {"id": "a913b285-6d0a-478e-aa24-18bb458d8f7a", "bridge": "br-int", "label": "tempest-network-smoke--994528417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82f4743a-dc", "ovs_interfaceid": "82f4743a-dcdc-49f7-be61-94d565e29842", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.052 2 DEBUG oslo_concurrency.lockutils [req-3474fb1f-ad2c-42e9-b2d3-dd50af38bc26 req-4424d32f-9f5d-4126-85c0-9cd46f5f1538 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-533c431a-8ae8-4310-81dc-29285b78f93c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.052 2 DEBUG nova.network.neutron [req-3474fb1f-ad2c-42e9-b2d3-dd50af38bc26 req-4424d32f-9f5d-4126-85c0-9cd46f5f1538 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Refreshing network info cache for port 82f4743a-dcdc-49f7-be61-94d565e29842 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.055 2 DEBUG nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Start _get_guest_xml network_info=[{"id": "82f4743a-dcdc-49f7-be61-94d565e29842", "address": "fa:16:3e:2e:6b:6c", "network": {"id": "a913b285-6d0a-478e-aa24-18bb458d8f7a", "bridge": "br-int", "label": "tempest-network-smoke--994528417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82f4743a-dc", "ovs_interfaceid": "82f4743a-dcdc-49f7-be61-94d565e29842", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'image_id': '23cfa426-7011-4566-992d-1c7af39f70dd'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.069 2 WARNING nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.075 2 DEBUG nova.virt.libvirt.host [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.076 2 DEBUG nova.virt.libvirt.host [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.079 2 DEBUG nova.virt.libvirt.host [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.079 2 DEBUG nova.virt.libvirt.host [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.080 2 DEBUG nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.080 2 DEBUG nova.virt.hardware [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T19:05:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e8a148fc-4419-4813-98ff-a17e2a95609e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.081 2 DEBUG nova.virt.hardware [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.081 2 DEBUG nova.virt.hardware [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.081 2 DEBUG nova.virt.hardware [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.081 2 DEBUG nova.virt.hardware [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.082 2 DEBUG nova.virt.hardware [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.082 2 DEBUG nova.virt.hardware [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.082 2 DEBUG nova.virt.hardware [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.082 2 DEBUG nova.virt.hardware [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.083 2 DEBUG nova.virt.hardware [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.083 2 DEBUG nova.virt.hardware [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.126 2 DEBUG nova.privsep.utils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.128 2 DEBUG nova.virt.libvirt.vif [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:05:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-447228763',display_name='tempest-TestNetworkBasicOps-server-447228763',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-447228763',id=1,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEkUPXM3K1FQRSOHUI4ceK1l6cbpFonPXFALKMkZcGgnSoRiUTQsb/Q287ApBX2G3xb2VwfVQAcm0rggAGmL4bEoFJTCQrQCAGh+fp9j7aUYBxWFzZf4Ok3jDCvBVuh0yA==',key_name='tempest-TestNetworkBasicOps-1885837558',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-2r2x09q7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:05:51Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=533c431a-8ae8-4310-81dc-29285b78f93c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "82f4743a-dcdc-49f7-be61-94d565e29842", "address": "fa:16:3e:2e:6b:6c", "network": {"id": "a913b285-6d0a-478e-aa24-18bb458d8f7a", "bridge": "br-int", "label": "tempest-network-smoke--994528417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82f4743a-dc", "ovs_interfaceid": "82f4743a-dcdc-49f7-be61-94d565e29842", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.129 2 DEBUG nova.network.os_vif_util [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "82f4743a-dcdc-49f7-be61-94d565e29842", "address": "fa:16:3e:2e:6b:6c", "network": {"id": "a913b285-6d0a-478e-aa24-18bb458d8f7a", "bridge": "br-int", "label": "tempest-network-smoke--994528417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82f4743a-dc", "ovs_interfaceid": "82f4743a-dcdc-49f7-be61-94d565e29842", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.130 2 DEBUG nova.network.os_vif_util [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:6b:6c,bridge_name='br-int',has_traffic_filtering=True,id=82f4743a-dcdc-49f7-be61-94d565e29842,network=Network(a913b285-6d0a-478e-aa24-18bb458d8f7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82f4743a-dc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.133 2 DEBUG nova.objects.instance [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 533c431a-8ae8-4310-81dc-29285b78f93c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.151 2 DEBUG nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] End _get_guest_xml xml=<domain type="kvm">
Oct  8 19:05:56 compute-0 nova_compute[117514]:  <uuid>533c431a-8ae8-4310-81dc-29285b78f93c</uuid>
Oct  8 19:05:56 compute-0 nova_compute[117514]:  <name>instance-00000001</name>
Oct  8 19:05:56 compute-0 nova_compute[117514]:  <memory>131072</memory>
Oct  8 19:05:56 compute-0 nova_compute[117514]:  <vcpu>1</vcpu>
Oct  8 19:05:56 compute-0 nova_compute[117514]:  <metadata>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 19:05:56 compute-0 nova_compute[117514]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:      <nova:name>tempest-TestNetworkBasicOps-server-447228763</nova:name>
Oct  8 19:05:56 compute-0 nova_compute[117514]:      <nova:creationTime>2025-10-08 19:05:56</nova:creationTime>
Oct  8 19:05:56 compute-0 nova_compute[117514]:      <nova:flavor name="m1.nano">
Oct  8 19:05:56 compute-0 nova_compute[117514]:        <nova:memory>128</nova:memory>
Oct  8 19:05:56 compute-0 nova_compute[117514]:        <nova:disk>1</nova:disk>
Oct  8 19:05:56 compute-0 nova_compute[117514]:        <nova:swap>0</nova:swap>
Oct  8 19:05:56 compute-0 nova_compute[117514]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 19:05:56 compute-0 nova_compute[117514]:        <nova:vcpus>1</nova:vcpus>
Oct  8 19:05:56 compute-0 nova_compute[117514]:      </nova:flavor>
Oct  8 19:05:56 compute-0 nova_compute[117514]:      <nova:owner>
Oct  8 19:05:56 compute-0 nova_compute[117514]:        <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct  8 19:05:56 compute-0 nova_compute[117514]:        <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct  8 19:05:56 compute-0 nova_compute[117514]:      </nova:owner>
Oct  8 19:05:56 compute-0 nova_compute[117514]:      <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:      <nova:ports>
Oct  8 19:05:56 compute-0 nova_compute[117514]:        <nova:port uuid="82f4743a-dcdc-49f7-be61-94d565e29842">
Oct  8 19:05:56 compute-0 nova_compute[117514]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:        </nova:port>
Oct  8 19:05:56 compute-0 nova_compute[117514]:      </nova:ports>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    </nova:instance>
Oct  8 19:05:56 compute-0 nova_compute[117514]:  </metadata>
Oct  8 19:05:56 compute-0 nova_compute[117514]:  <sysinfo type="smbios">
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <system>
Oct  8 19:05:56 compute-0 nova_compute[117514]:      <entry name="manufacturer">RDO</entry>
Oct  8 19:05:56 compute-0 nova_compute[117514]:      <entry name="product">OpenStack Compute</entry>
Oct  8 19:05:56 compute-0 nova_compute[117514]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 19:05:56 compute-0 nova_compute[117514]:      <entry name="serial">533c431a-8ae8-4310-81dc-29285b78f93c</entry>
Oct  8 19:05:56 compute-0 nova_compute[117514]:      <entry name="uuid">533c431a-8ae8-4310-81dc-29285b78f93c</entry>
Oct  8 19:05:56 compute-0 nova_compute[117514]:      <entry name="family">Virtual Machine</entry>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    </system>
Oct  8 19:05:56 compute-0 nova_compute[117514]:  </sysinfo>
Oct  8 19:05:56 compute-0 nova_compute[117514]:  <os>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <boot dev="hd"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <smbios mode="sysinfo"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:  </os>
Oct  8 19:05:56 compute-0 nova_compute[117514]:  <features>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <acpi/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <apic/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <vmcoreinfo/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:  </features>
Oct  8 19:05:56 compute-0 nova_compute[117514]:  <clock offset="utc">
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <timer name="hpet" present="no"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:  </clock>
Oct  8 19:05:56 compute-0 nova_compute[117514]:  <cpu mode="host-model" match="exact">
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:  </cpu>
Oct  8 19:05:56 compute-0 nova_compute[117514]:  <devices>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <disk type="file" device="disk">
Oct  8 19:05:56 compute-0 nova_compute[117514]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:      <source file="/var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/disk"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:      <target dev="vda" bus="virtio"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <disk type="file" device="cdrom">
Oct  8 19:05:56 compute-0 nova_compute[117514]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:      <source file="/var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/disk.config"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:      <target dev="sda" bus="sata"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <interface type="ethernet">
Oct  8 19:05:56 compute-0 nova_compute[117514]:      <mac address="fa:16:3e:2e:6b:6c"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:      <model type="virtio"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:      <mtu size="1442"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:      <target dev="tap82f4743a-dc"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    </interface>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <serial type="pty">
Oct  8 19:05:56 compute-0 nova_compute[117514]:      <log file="/var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/console.log" append="off"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    </serial>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <video>
Oct  8 19:05:56 compute-0 nova_compute[117514]:      <model type="virtio"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    </video>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <input type="tablet" bus="usb"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <rng model="virtio">
Oct  8 19:05:56 compute-0 nova_compute[117514]:      <backend model="random">/dev/urandom</backend>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    </rng>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <controller type="usb" index="0"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    <memballoon model="virtio">
Oct  8 19:05:56 compute-0 nova_compute[117514]:      <stats period="10"/>
Oct  8 19:05:56 compute-0 nova_compute[117514]:    </memballoon>
Oct  8 19:05:56 compute-0 nova_compute[117514]:  </devices>
Oct  8 19:05:56 compute-0 nova_compute[117514]: </domain>
Oct  8 19:05:56 compute-0 nova_compute[117514]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.153 2 DEBUG nova.compute.manager [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Preparing to wait for external event network-vif-plugged-82f4743a-dcdc-49f7-be61-94d565e29842 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.154 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "533c431a-8ae8-4310-81dc-29285b78f93c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.154 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "533c431a-8ae8-4310-81dc-29285b78f93c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.155 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "533c431a-8ae8-4310-81dc-29285b78f93c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.156 2 DEBUG nova.virt.libvirt.vif [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:05:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-447228763',display_name='tempest-TestNetworkBasicOps-server-447228763',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-447228763',id=1,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEkUPXM3K1FQRSOHUI4ceK1l6cbpFonPXFALKMkZcGgnSoRiUTQsb/Q287ApBX2G3xb2VwfVQAcm0rggAGmL4bEoFJTCQrQCAGh+fp9j7aUYBxWFzZf4Ok3jDCvBVuh0yA==',key_name='tempest-TestNetworkBasicOps-1885837558',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-2r2x09q7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:05:51Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=533c431a-8ae8-4310-81dc-29285b78f93c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "82f4743a-dcdc-49f7-be61-94d565e29842", "address": "fa:16:3e:2e:6b:6c", "network": {"id": "a913b285-6d0a-478e-aa24-18bb458d8f7a", "bridge": "br-int", "label": "tempest-network-smoke--994528417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82f4743a-dc", "ovs_interfaceid": "82f4743a-dcdc-49f7-be61-94d565e29842", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.157 2 DEBUG nova.network.os_vif_util [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "82f4743a-dcdc-49f7-be61-94d565e29842", "address": "fa:16:3e:2e:6b:6c", "network": {"id": "a913b285-6d0a-478e-aa24-18bb458d8f7a", "bridge": "br-int", "label": "tempest-network-smoke--994528417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82f4743a-dc", "ovs_interfaceid": "82f4743a-dcdc-49f7-be61-94d565e29842", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.158 2 DEBUG nova.network.os_vif_util [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:6b:6c,bridge_name='br-int',has_traffic_filtering=True,id=82f4743a-dcdc-49f7-be61-94d565e29842,network=Network(a913b285-6d0a-478e-aa24-18bb458d8f7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82f4743a-dc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.158 2 DEBUG os_vif [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:6b:6c,bridge_name='br-int',has_traffic_filtering=True,id=82f4743a-dcdc-49f7-be61-94d565e29842,network=Network(a913b285-6d0a-478e-aa24-18bb458d8f7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82f4743a-dc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.298 2 DEBUG ovsdbapp.backend.ovs_idl [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.299 2 DEBUG ovsdbapp.backend.ovs_idl [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.299 2 DEBUG ovsdbapp.backend.ovs_idl [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [POLLOUT] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.320 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.321 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:05:56 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.321 2 INFO oslo.privsep.daemon [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpctdqzjyn/privsep.sock']#033[00m
Oct  8 19:05:56 compute-0 podman[144648]: 2025-10-08 19:05:56.691418102 +0000 UTC m=+0.113696422 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:57.043 2 INFO oslo.privsep.daemon [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Spawned new privsep daemon via rootwrap#033[00m
Oct  8 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.907 75 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct  8 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.911 75 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct  8 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.914 75 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m
Oct  8 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:56.914 75 INFO oslo.privsep.daemon [-] privsep daemon running as pid 75#033[00m
Oct  8 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:57.247 2 DEBUG nova.network.neutron [req-3474fb1f-ad2c-42e9-b2d3-dd50af38bc26 req-4424d32f-9f5d-4126-85c0-9cd46f5f1538 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Updated VIF entry in instance network info cache for port 82f4743a-dcdc-49f7-be61-94d565e29842. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:57.248 2 DEBUG nova.network.neutron [req-3474fb1f-ad2c-42e9-b2d3-dd50af38bc26 req-4424d32f-9f5d-4126-85c0-9cd46f5f1538 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Updating instance_info_cache with network_info: [{"id": "82f4743a-dcdc-49f7-be61-94d565e29842", "address": "fa:16:3e:2e:6b:6c", "network": {"id": "a913b285-6d0a-478e-aa24-18bb458d8f7a", "bridge": "br-int", "label": "tempest-network-smoke--994528417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82f4743a-dc", "ovs_interfaceid": "82f4743a-dcdc-49f7-be61-94d565e29842", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:57.266 2 DEBUG oslo_concurrency.lockutils [req-3474fb1f-ad2c-42e9-b2d3-dd50af38bc26 req-4424d32f-9f5d-4126-85c0-9cd46f5f1538 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-533c431a-8ae8-4310-81dc-29285b78f93c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:57.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:57.363 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap82f4743a-dc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:57.364 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap82f4743a-dc, col_values=(('external_ids', {'iface-id': '82f4743a-dcdc-49f7-be61-94d565e29842', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2e:6b:6c', 'vm-uuid': '533c431a-8ae8-4310-81dc-29285b78f93c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:05:57 compute-0 NetworkManager[1035]: <info>  [1759950357.3670] manager: (tap82f4743a-dc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/21)
Oct  8 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:57.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:57.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:57.375 2 INFO os_vif [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:6b:6c,bridge_name='br-int',has_traffic_filtering=True,id=82f4743a-dcdc-49f7-be61-94d565e29842,network=Network(a913b285-6d0a-478e-aa24-18bb458d8f7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82f4743a-dc')#033[00m
Oct  8 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:57.414 2 DEBUG nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:57.415 2 DEBUG nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:57.415 2 DEBUG nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No VIF found with MAC fa:16:3e:2e:6b:6c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:57.415 2 INFO nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Using config drive#033[00m
Oct  8 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:57.833 2 INFO nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Creating config drive at /var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/disk.config#033[00m
Oct  8 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:57.842 2 DEBUG oslo_concurrency.processutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp69xewfxk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:05:57 compute-0 nova_compute[117514]: 2025-10-08 19:05:57.993 2 DEBUG oslo_concurrency.processutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp69xewfxk" returned: 0 in 0.151s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:05:58 compute-0 kernel: tun: Universal TUN/TAP device driver, 1.6
Oct  8 19:05:58 compute-0 kernel: tap82f4743a-dc: entered promiscuous mode
Oct  8 19:05:58 compute-0 NetworkManager[1035]: <info>  [1759950358.0815] manager: (tap82f4743a-dc): new Tun device (/org/freedesktop/NetworkManager/Devices/22)
Oct  8 19:05:58 compute-0 ovn_controller[19759]: 2025-10-08T19:05:58Z|00027|binding|INFO|Claiming lport 82f4743a-dcdc-49f7-be61-94d565e29842 for this chassis.
Oct  8 19:05:58 compute-0 nova_compute[117514]: 2025-10-08 19:05:58.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:05:58 compute-0 ovn_controller[19759]: 2025-10-08T19:05:58Z|00028|binding|INFO|82f4743a-dcdc-49f7-be61-94d565e29842: Claiming fa:16:3e:2e:6b:6c 10.100.0.3
Oct  8 19:05:58 compute-0 nova_compute[117514]: 2025-10-08 19:05:58.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:05:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:58.098 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:6b:6c 10.100.0.3'], port_security=['fa:16:3e:2e:6b:6c 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a913b285-6d0a-478e-aa24-18bb458d8f7a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd3706646-002b-4286-ab41-a86fd84e3356', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=30f96b84-f723-4541-a1ae-463e873ff4a9, chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=82f4743a-dcdc-49f7-be61-94d565e29842) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:05:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:58.100 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 82f4743a-dcdc-49f7-be61-94d565e29842 in datapath a913b285-6d0a-478e-aa24-18bb458d8f7a bound to our chassis#033[00m
Oct  8 19:05:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:58.101 28643 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a913b285-6d0a-478e-aa24-18bb458d8f7a#033[00m
Oct  8 19:05:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:58.102 28643 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpwcgl_2bs/privsep.sock']#033[00m
Oct  8 19:05:58 compute-0 systemd-udevd[144702]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 19:05:58 compute-0 NetworkManager[1035]: <info>  [1759950358.1261] device (tap82f4743a-dc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 19:05:58 compute-0 NetworkManager[1035]: <info>  [1759950358.1267] device (tap82f4743a-dc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 19:05:58 compute-0 systemd-machined[77568]: New machine qemu-1-instance-00000001.
Oct  8 19:05:58 compute-0 nova_compute[117514]: 2025-10-08 19:05:58.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:05:58 compute-0 ovn_controller[19759]: 2025-10-08T19:05:58Z|00029|binding|INFO|Setting lport 82f4743a-dcdc-49f7-be61-94d565e29842 ovn-installed in OVS
Oct  8 19:05:58 compute-0 ovn_controller[19759]: 2025-10-08T19:05:58Z|00030|binding|INFO|Setting lport 82f4743a-dcdc-49f7-be61-94d565e29842 up in Southbound
Oct  8 19:05:58 compute-0 nova_compute[117514]: 2025-10-08 19:05:58.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:05:58 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Oct  8 19:05:58 compute-0 nova_compute[117514]: 2025-10-08 19:05:58.387 2 DEBUG nova.compute.manager [req-94b4723a-a5e7-4f69-96ff-b852dc67af3e req-c17741c4-e8d4-457a-b540-cb8317abaf1b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Received event network-vif-plugged-82f4743a-dcdc-49f7-be61-94d565e29842 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:05:58 compute-0 nova_compute[117514]: 2025-10-08 19:05:58.387 2 DEBUG oslo_concurrency.lockutils [req-94b4723a-a5e7-4f69-96ff-b852dc67af3e req-c17741c4-e8d4-457a-b540-cb8317abaf1b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "533c431a-8ae8-4310-81dc-29285b78f93c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:05:58 compute-0 nova_compute[117514]: 2025-10-08 19:05:58.388 2 DEBUG oslo_concurrency.lockutils [req-94b4723a-a5e7-4f69-96ff-b852dc67af3e req-c17741c4-e8d4-457a-b540-cb8317abaf1b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "533c431a-8ae8-4310-81dc-29285b78f93c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:05:58 compute-0 nova_compute[117514]: 2025-10-08 19:05:58.388 2 DEBUG oslo_concurrency.lockutils [req-94b4723a-a5e7-4f69-96ff-b852dc67af3e req-c17741c4-e8d4-457a-b540-cb8317abaf1b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "533c431a-8ae8-4310-81dc-29285b78f93c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:05:58 compute-0 nova_compute[117514]: 2025-10-08 19:05:58.388 2 DEBUG nova.compute.manager [req-94b4723a-a5e7-4f69-96ff-b852dc67af3e req-c17741c4-e8d4-457a-b540-cb8317abaf1b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Processing event network-vif-plugged-82f4743a-dcdc-49f7-be61-94d565e29842 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 19:05:58 compute-0 nova_compute[117514]: 2025-10-08 19:05:58.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:05:58 compute-0 nova_compute[117514]: 2025-10-08 19:05:58.717 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  8 19:05:58 compute-0 nova_compute[117514]: 2025-10-08 19:05:58.735 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  8 19:05:58 compute-0 nova_compute[117514]: 2025-10-08 19:05:58.735 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:05:58 compute-0 nova_compute[117514]: 2025-10-08 19:05:58.735 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  8 19:05:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:58.739 28643 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Oct  8 19:05:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:58.740 28643 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpwcgl_2bs/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Oct  8 19:05:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:58.610 144726 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct  8 19:05:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:58.617 144726 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct  8 19:05:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:58.620 144726 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m
Oct  8 19:05:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:58.621 144726 INFO oslo.privsep.daemon [-] privsep daemon running as pid 144726#033[00m
Oct  8 19:05:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:58.743 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[78f73a4d-f8c8-4ebb-b6ce-e1bdd1ee2abc]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:05:58 compute-0 nova_compute[117514]: 2025-10-08 19:05:58.746 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.013 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950359.0127413, 533c431a-8ae8-4310-81dc-29285b78f93c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.013 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] VM Started (Lifecycle Event)#033[00m
Oct  8 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.015 2 DEBUG nova.compute.manager [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.029 2 DEBUG nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.033 2 INFO nova.virt.libvirt.driver [-] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Instance spawned successfully.#033[00m
Oct  8 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.033 2 DEBUG nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.097 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.102 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.129 2 DEBUG nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.129 2 DEBUG nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.130 2 DEBUG nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.131 2 DEBUG nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.132 2 DEBUG nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.133 2 DEBUG nova.virt.libvirt.driver [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.140 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.140 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950359.0141673, 533c431a-8ae8-4310-81dc-29285b78f93c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.141 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] VM Paused (Lifecycle Event)#033[00m
Oct  8 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.205 2 INFO nova.compute.manager [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Took 7.93 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.206 2 DEBUG nova.compute.manager [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.220 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.223 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950359.0185692, 533c431a-8ae8-4310-81dc-29285b78f93c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.224 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] VM Resumed (Lifecycle Event)#033[00m
Oct  8 19:05:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:59.232 144726 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:05:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:59.232 144726 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:05:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:59.232 144726 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.260 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.264 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.300 2 INFO nova.compute.manager [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Took 8.41 seconds to build instance.#033[00m
Oct  8 19:05:59 compute-0 nova_compute[117514]: 2025-10-08 19:05:59.323 2 DEBUG oslo_concurrency.lockutils [None req-7c4d8633-7117-4060-baac-411fe1e3efc9 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "533c431a-8ae8-4310-81dc-29285b78f93c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.534s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:05:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:59.812 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[ddfb92a8-9f26-449d-ba87-3b1487f8ed33]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:05:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:59.813 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa913b285-61 in ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 19:05:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:59.816 144726 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa913b285-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 19:05:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:59.816 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[166982ad-869b-4c00-af73-6c6ef338cb43]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:05:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:59.819 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[ff2fa2c9-beb7-44d6-a193-5cbcbde32d42]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:05:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:59.853 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[7138f92a-e9f9-443c-80aa-19c154659748]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:05:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:59.884 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[3cebb73c-de61-4adb-8703-ac0a7d54f88a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:05:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:05:59.887 28643 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp23wxy_tt/privsep.sock']#033[00m
Oct  8 19:06:00 compute-0 nova_compute[117514]: 2025-10-08 19:06:00.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:00 compute-0 nova_compute[117514]: 2025-10-08 19:06:00.509 2 DEBUG nova.compute.manager [req-e62b17cc-b9bf-4c77-8efe-65522aed82ad req-a53561a2-a5ac-4a90-80e4-9837d516abf8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Received event network-vif-plugged-82f4743a-dcdc-49f7-be61-94d565e29842 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:06:00 compute-0 nova_compute[117514]: 2025-10-08 19:06:00.510 2 DEBUG oslo_concurrency.lockutils [req-e62b17cc-b9bf-4c77-8efe-65522aed82ad req-a53561a2-a5ac-4a90-80e4-9837d516abf8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "533c431a-8ae8-4310-81dc-29285b78f93c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:06:00 compute-0 nova_compute[117514]: 2025-10-08 19:06:00.510 2 DEBUG oslo_concurrency.lockutils [req-e62b17cc-b9bf-4c77-8efe-65522aed82ad req-a53561a2-a5ac-4a90-80e4-9837d516abf8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "533c431a-8ae8-4310-81dc-29285b78f93c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:06:00 compute-0 nova_compute[117514]: 2025-10-08 19:06:00.511 2 DEBUG oslo_concurrency.lockutils [req-e62b17cc-b9bf-4c77-8efe-65522aed82ad req-a53561a2-a5ac-4a90-80e4-9837d516abf8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "533c431a-8ae8-4310-81dc-29285b78f93c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:06:00 compute-0 nova_compute[117514]: 2025-10-08 19:06:00.511 2 DEBUG nova.compute.manager [req-e62b17cc-b9bf-4c77-8efe-65522aed82ad req-a53561a2-a5ac-4a90-80e4-9837d516abf8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] No waiting events found dispatching network-vif-plugged-82f4743a-dcdc-49f7-be61-94d565e29842 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:06:00 compute-0 nova_compute[117514]: 2025-10-08 19:06:00.511 2 WARNING nova.compute.manager [req-e62b17cc-b9bf-4c77-8efe-65522aed82ad req-a53561a2-a5ac-4a90-80e4-9837d516abf8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Received unexpected event network-vif-plugged-82f4743a-dcdc-49f7-be61-94d565e29842 for instance with vm_state active and task_state None.#033[00m
Oct  8 19:06:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:00.553 28643 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Oct  8 19:06:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:00.555 28643 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp23wxy_tt/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Oct  8 19:06:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:00.437 144740 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct  8 19:06:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:00.444 144740 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct  8 19:06:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:00.448 144740 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Oct  8 19:06:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:00.448 144740 INFO oslo.privsep.daemon [-] privsep daemon running as pid 144740#033[00m
Oct  8 19:06:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:00.559 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[9abbe92c-4ba0-452f-81a1-7c732f80de13]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.011 144740 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.011 144740 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.011 144740 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.576 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[82cc5cfe-37d3-4b21-8041-7c6ce15ab75b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:06:01 compute-0 NetworkManager[1035]: <info>  [1759950361.5904] manager: (tapa913b285-60): new Veth device (/org/freedesktop/NetworkManager/Devices/23)
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.589 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[faedad42-74a4-4c5a-93a6-6307cdc51eee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.631 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[e4e5d935-148f-4d34-a3cb-779ef932288b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:06:01 compute-0 systemd-udevd[144750]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.640 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[d229c2a9-e94f-46b6-a0d4-7a3e564d0431]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:06:01 compute-0 NetworkManager[1035]: <info>  [1759950361.6674] device (tapa913b285-60): carrier: link connected
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.678 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[666618cc-f3fd-4286-b318-0797001b7619]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.706 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[314a3fb3-9ac6-4024-b2e3-7734946d5a73]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa913b285-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ff:f1:09'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 103224, 'reachable_time': 19301, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 144769, 'error': None, 'target': 'ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.730 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[c5d49245-da07-447f-b2df-6f83ad79f103]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feff:f109'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 103224, 'tstamp': 103224}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 144770, 'error': None, 'target': 'ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.755 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[b2365eeb-9c5e-4946-b634-bf3512b69c67]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa913b285-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ff:f1:09'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 103224, 'reachable_time': 19301, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 144771, 'error': None, 'target': 'ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.798 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[409dd171-6139-4fd1-8097-ce59da49e903]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.890 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[4c9eb95a-f153-4ada-b626-a7b245c976ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.893 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa913b285-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.893 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.894 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa913b285-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:06:01 compute-0 nova_compute[117514]: 2025-10-08 19:06:01.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:01 compute-0 kernel: tapa913b285-60: entered promiscuous mode
Oct  8 19:06:01 compute-0 NetworkManager[1035]: <info>  [1759950361.8989] manager: (tapa913b285-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Oct  8 19:06:01 compute-0 nova_compute[117514]: 2025-10-08 19:06:01.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.903 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa913b285-60, col_values=(('external_ids', {'iface-id': 'f9878aab-28ef-456a-a43a-7cacc2381b1f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:06:01 compute-0 nova_compute[117514]: 2025-10-08 19:06:01.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:01 compute-0 ovn_controller[19759]: 2025-10-08T19:06:01Z|00031|binding|INFO|Releasing lport f9878aab-28ef-456a-a43a-7cacc2381b1f from this chassis (sb_readonly=0)
Oct  8 19:06:01 compute-0 nova_compute[117514]: 2025-10-08 19:06:01.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.908 28643 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a913b285-6d0a-478e-aa24-18bb458d8f7a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a913b285-6d0a-478e-aa24-18bb458d8f7a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.910 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[49711289-52ee-443c-befd-8fab8565e175]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.911 28643 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]: global
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]:    log         /dev/log local0 debug
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]:    log-tag     haproxy-metadata-proxy-a913b285-6d0a-478e-aa24-18bb458d8f7a
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]:    user        root
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]:    group       root
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]:    maxconn     1024
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]:    pidfile     /var/lib/neutron/external/pids/a913b285-6d0a-478e-aa24-18bb458d8f7a.pid.haproxy
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]:    daemon
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]: 
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]: defaults
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]:    log global
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]:    mode http
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]:    option httplog
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]:    option dontlognull
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]:    option http-server-close
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]:    option forwardfor
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]:    retries                 3
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]:    timeout http-request    30s
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]:    timeout connect         30s
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]:    timeout client          32s
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]:    timeout server          32s
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]:    timeout http-keep-alive 30s
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]: 
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]: 
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]: listen listener
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]:    bind 169.254.169.254:80
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]:    http-request add-header X-OVN-Network-ID a913b285-6d0a-478e-aa24-18bb458d8f7a
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 19:06:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:01.913 28643 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a', 'env', 'PROCESS_TAG=haproxy-a913b285-6d0a-478e-aa24-18bb458d8f7a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a913b285-6d0a-478e-aa24-18bb458d8f7a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 19:06:01 compute-0 nova_compute[117514]: 2025-10-08 19:06:01.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:02 compute-0 podman[144804]: 2025-10-08 19:06:02.359540899 +0000 UTC m=+0.066390548 container create 3efae6d2598078a444d1d0b5df7fb7ce2c474b330f618d9a2595c2a8e415d9fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  8 19:06:02 compute-0 nova_compute[117514]: 2025-10-08 19:06:02.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:02 compute-0 systemd[1]: Started libpod-conmon-3efae6d2598078a444d1d0b5df7fb7ce2c474b330f618d9a2595c2a8e415d9fa.scope.
Oct  8 19:06:02 compute-0 podman[144804]: 2025-10-08 19:06:02.32489971 +0000 UTC m=+0.031749399 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct  8 19:06:02 compute-0 systemd[1]: Started libcrun container.
Oct  8 19:06:02 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d18ad23441d9b789fc1d272bc7da5f2175a2c9c00ba472ac96b05443d421d00c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 19:06:02 compute-0 podman[144804]: 2025-10-08 19:06:02.475930446 +0000 UTC m=+0.182780125 container init 3efae6d2598078a444d1d0b5df7fb7ce2c474b330f618d9a2595c2a8e415d9fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team)
Oct  8 19:06:02 compute-0 podman[144804]: 2025-10-08 19:06:02.482671304 +0000 UTC m=+0.189520953 container start 3efae6d2598078a444d1d0b5df7fb7ce2c474b330f618d9a2595c2a8e415d9fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  8 19:06:02 compute-0 neutron-haproxy-ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a[144818]: [NOTICE]   (144822) : New worker (144824) forked
Oct  8 19:06:02 compute-0 neutron-haproxy-ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a[144818]: [NOTICE]   (144822) : Loading success.
Oct  8 19:06:02 compute-0 nova_compute[117514]: 2025-10-08 19:06:02.750 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:06:02 compute-0 nova_compute[117514]: 2025-10-08 19:06:02.753 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:06:02 compute-0 nova_compute[117514]: 2025-10-08 19:06:02.753 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 19:06:02 compute-0 nova_compute[117514]: 2025-10-08 19:06:02.753 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 19:06:03 compute-0 nova_compute[117514]: 2025-10-08 19:06:03.024 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "refresh_cache-533c431a-8ae8-4310-81dc-29285b78f93c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:06:03 compute-0 nova_compute[117514]: 2025-10-08 19:06:03.027 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquired lock "refresh_cache-533c431a-8ae8-4310-81dc-29285b78f93c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:06:03 compute-0 nova_compute[117514]: 2025-10-08 19:06:03.028 2 DEBUG nova.network.neutron [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 19:06:03 compute-0 nova_compute[117514]: 2025-10-08 19:06:03.029 2 DEBUG nova.objects.instance [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 533c431a-8ae8-4310-81dc-29285b78f93c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.076 2 DEBUG nova.network.neutron [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Updating instance_info_cache with network_info: [{"id": "82f4743a-dcdc-49f7-be61-94d565e29842", "address": "fa:16:3e:2e:6b:6c", "network": {"id": "a913b285-6d0a-478e-aa24-18bb458d8f7a", "bridge": "br-int", "label": "tempest-network-smoke--994528417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82f4743a-dc", "ovs_interfaceid": "82f4743a-dcdc-49f7-be61-94d565e29842", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.097 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Releasing lock "refresh_cache-533c431a-8ae8-4310-81dc-29285b78f93c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.098 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.100 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.101 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.101 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.105 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.106 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.106 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.107 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.108 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.132 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.133 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.134 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.135 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.212 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:06:05 compute-0 ovn_controller[19759]: 2025-10-08T19:06:05Z|00032|binding|INFO|Releasing lport f9878aab-28ef-456a-a43a-7cacc2381b1f from this chassis (sb_readonly=0)
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:05 compute-0 NetworkManager[1035]: <info>  [1759950365.2485] manager: (patch-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/25)
Oct  8 19:06:05 compute-0 NetworkManager[1035]: <info>  [1759950365.2501] device (patch-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 19:06:05 compute-0 NetworkManager[1035]: <info>  [1759950365.2570] manager: (patch-br-int-to-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/26)
Oct  8 19:06:05 compute-0 NetworkManager[1035]: <info>  [1759950365.2577] device (patch-br-int-to-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  8 19:06:05 compute-0 NetworkManager[1035]: <info>  [1759950365.2591] manager: (patch-br-int-to-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Oct  8 19:06:05 compute-0 NetworkManager[1035]: <info>  [1759950365.2599] manager: (patch-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Oct  8 19:06:05 compute-0 NetworkManager[1035]: <info>  [1759950365.2605] device (patch-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct  8 19:06:05 compute-0 NetworkManager[1035]: <info>  [1759950365.2609] device (patch-br-int-to-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct  8 19:06:05 compute-0 podman[144834]: 2025-10-08 19:06:05.278920074 +0000 UTC m=+0.094583137 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.287 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.288 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:06:05 compute-0 ovn_controller[19759]: 2025-10-08T19:06:05Z|00033|binding|INFO|Releasing lport f9878aab-28ef-456a-a43a-7cacc2381b1f from this chassis (sb_readonly=0)
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.340 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.466 2 DEBUG nova.compute.manager [req-90b99b33-b76a-4c31-82fc-5c145b98c9c6 req-ea303629-57ee-47d1-a093-2509ad8a9aba bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Received event network-changed-82f4743a-dcdc-49f7-be61-94d565e29842 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.467 2 DEBUG nova.compute.manager [req-90b99b33-b76a-4c31-82fc-5c145b98c9c6 req-ea303629-57ee-47d1-a093-2509ad8a9aba bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Refreshing instance network info cache due to event network-changed-82f4743a-dcdc-49f7-be61-94d565e29842. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.468 2 DEBUG oslo_concurrency.lockutils [req-90b99b33-b76a-4c31-82fc-5c145b98c9c6 req-ea303629-57ee-47d1-a093-2509ad8a9aba bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-533c431a-8ae8-4310-81dc-29285b78f93c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.468 2 DEBUG oslo_concurrency.lockutils [req-90b99b33-b76a-4c31-82fc-5c145b98c9c6 req-ea303629-57ee-47d1-a093-2509ad8a9aba bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-533c431a-8ae8-4310-81dc-29285b78f93c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.469 2 DEBUG nova.network.neutron [req-90b99b33-b76a-4c31-82fc-5c145b98c9c6 req-ea303629-57ee-47d1-a093-2509ad8a9aba bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Refreshing network info cache for port 82f4743a-dcdc-49f7-be61-94d565e29842 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.525 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.526 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5946MB free_disk=73.42291259765625GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.527 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.527 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.664 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Instance 533c431a-8ae8-4310-81dc-29285b78f93c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.666 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.666 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.723 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Refreshing inventories for resource provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.769 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Updating ProviderTree inventory for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.770 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Updating inventory in ProviderTree for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.790 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Refreshing aggregate associations for resource provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.813 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Refreshing trait associations for resource provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_FDC,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AVX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_FMA3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SVM,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX2,HW_CPU_X86_SSE,HW_CPU_X86_CLMUL,HW_CPU_X86_F16C,HW_CPU_X86_AMD_SVM,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_BMI,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_PCNET _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.878 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Updating inventory in ProviderTree for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.920 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Updated inventory for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.924 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Updating resource provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.924 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Updating inventory in ProviderTree for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.956 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 19:06:05 compute-0 nova_compute[117514]: 2025-10-08 19:06:05.956 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.429s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:06:06 compute-0 nova_compute[117514]: 2025-10-08 19:06:06.515 2 DEBUG nova.network.neutron [req-90b99b33-b76a-4c31-82fc-5c145b98c9c6 req-ea303629-57ee-47d1-a093-2509ad8a9aba bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Updated VIF entry in instance network info cache for port 82f4743a-dcdc-49f7-be61-94d565e29842. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 19:06:06 compute-0 nova_compute[117514]: 2025-10-08 19:06:06.516 2 DEBUG nova.network.neutron [req-90b99b33-b76a-4c31-82fc-5c145b98c9c6 req-ea303629-57ee-47d1-a093-2509ad8a9aba bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Updating instance_info_cache with network_info: [{"id": "82f4743a-dcdc-49f7-be61-94d565e29842", "address": "fa:16:3e:2e:6b:6c", "network": {"id": "a913b285-6d0a-478e-aa24-18bb458d8f7a", "bridge": "br-int", "label": "tempest-network-smoke--994528417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82f4743a-dc", "ovs_interfaceid": "82f4743a-dcdc-49f7-be61-94d565e29842", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:06:06 compute-0 nova_compute[117514]: 2025-10-08 19:06:06.535 2 DEBUG oslo_concurrency.lockutils [req-90b99b33-b76a-4c31-82fc-5c145b98c9c6 req-ea303629-57ee-47d1-a093-2509ad8a9aba bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-533c431a-8ae8-4310-81dc-29285b78f93c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:06:07 compute-0 nova_compute[117514]: 2025-10-08 19:06:07.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.592 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}151d582867e0a6380c4d0d029ac59b6d50f43a9bf2fdcced1ca2054ddb79aeff" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.662 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Wed, 08 Oct 2025 19:06:08 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-175e4ff3-0db0-45f6-9e8e-4b14d7e7d211 x-openstack-request-id: req-175e4ff3-0db0-45f6-9e8e-4b14d7e7d211 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.662 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "0e642ddb-c06b-4314-8c06-76ae32c14bd7", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/0e642ddb-c06b-4314-8c06-76ae32c14bd7"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/0e642ddb-c06b-4314-8c06-76ae32c14bd7"}]}, {"id": "e8a148fc-4419-4813-98ff-a17e2a95609e", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/e8a148fc-4419-4813-98ff-a17e2a95609e"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/e8a148fc-4419-4813-98ff-a17e2a95609e"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.662 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-175e4ff3-0db0-45f6-9e8e-4b14d7e7d211 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.664 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/e8a148fc-4419-4813-98ff-a17e2a95609e -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}151d582867e0a6380c4d0d029ac59b6d50f43a9bf2fdcced1ca2054ddb79aeff" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.717 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 495 Content-Type: application/json Date: Wed, 08 Oct 2025 19:06:08 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-ab2f69df-ec97-439d-8d8a-9211b5334b9c x-openstack-request-id: req-ab2f69df-ec97-439d-8d8a-9211b5334b9c _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.717 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "e8a148fc-4419-4813-98ff-a17e2a95609e", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/e8a148fc-4419-4813-98ff-a17e2a95609e"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/e8a148fc-4419-4813-98ff-a17e2a95609e"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.717 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/e8a148fc-4419-4813-98ff-a17e2a95609e used request id req-ab2f69df-ec97-439d-8d8a-9211b5334b9c request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.719 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'name': 'tempest-TestNetworkBasicOps-server-447228763', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000001', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'hostId': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.719 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.719 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.719 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-447228763>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-447228763>]
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.721 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.721 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.721 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-447228763>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-447228763>]
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.721 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.724 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 533c431a-8ae8-4310-81dc-29285b78f93c / tap82f4743a-dc inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.724 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '35907205-2001-4298-b707-b7c15bfc8a4d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000001-533c431a-8ae8-4310-81dc-29285b78f93c-tap82f4743a-dc', 'timestamp': '2025-10-08T19:06:08.722071', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'tap82f4743a-dc', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2e:6b:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82f4743a-dc'}, 'message_id': 'd901c0d6-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.351413058, 'message_signature': '814dcd68873ab01e1c8e1a3c4e1eb53c88bdd2bfc5c9a292bc08991897fe2428'}]}, 'timestamp': '2025-10-08 19:06:08.725746', '_unique_id': '3df6988ba98d47dd86a91eee91cb2b1a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.733 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.745 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.756 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.756 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2132c7bf-c3db-4001-874b-20b74769015f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c-vda', 'timestamp': '2025-10-08T19:06:08.745917', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9068698-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.375248515, 'message_signature': '485e117a1b94a71ef7761fe483f3f663b00fb582d0601b665bda645a7603bb5a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c-sda', 'timestamp': '2025-10-08T19:06:08.745917', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd9069700-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.375248515, 'message_signature': '6ecc4cc1becb660186a0f71bf1d43249562cc9002cd33ef7e5bfce8ffa44d0db'}]}, 'timestamp': '2025-10-08 19:06:08.757271', '_unique_id': '7f4736544fec4b9981d41f908e33d284'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.758 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.759 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.780 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.781 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f810ccde-87fc-49f2-9ee2-be573484263e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c-vda', 'timestamp': '2025-10-08T19:06:08.759323', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd90a3a40-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.38863171, 'message_signature': '91541f0b9023891debd66cf2b872c7d16c3f815ad3dccacd0672659312253e8c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c-sda', 'timestamp': '2025-10-08T19:06:08.759323', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd90a49c2-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.38863171, 'message_signature': '5afca42953fd269c595897721dcfb47a5d6907f6a5d5ad022954cbf525a35663'}]}, 'timestamp': '2025-10-08 19:06:08.781507', '_unique_id': '10912d924bbf4eb396e077f15e20a4d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.782 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.783 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.783 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/disk.device.read.latency volume: 515933499 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.783 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/disk.device.read.latency volume: 2839399 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d9c83e1-911f-4754-a85f-50bea97b7f01', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 515933499, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c-vda', 'timestamp': '2025-10-08T19:06:08.783477', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd90aa322-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.38863171, 'message_signature': 'f3dbc3eec616903c6e6872994785cab032060c184cd3eb3a9b4b1c65db8bf6df'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2839399, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c-sda', 'timestamp': '2025-10-08T19:06:08.783477', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd90aaf8e-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.38863171, 'message_signature': 'f31722044e2ef5eee98573313da3f03911db9834120b245e00e82a9c70668822'}]}, 'timestamp': '2025-10-08 19:06:08.784104', '_unique_id': '67fdfb1addc3410aa064363c638d7316'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.784 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.785 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.785 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '79590750-27de-497c-bf98-07408ab21177', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000001-533c431a-8ae8-4310-81dc-29285b78f93c-tap82f4743a-dc', 'timestamp': '2025-10-08T19:06:08.785776', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'tap82f4743a-dc', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2e:6b:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82f4743a-dc'}, 'message_id': 'd90afeb2-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.351413058, 'message_signature': 'a05d8b2a1648829827856429737c34c32db6c705d675d9db52170f9b4e11c57a'}]}, 'timestamp': '2025-10-08 19:06:08.786165', '_unique_id': '95af96018ac9476086b208122da5511d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.786 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.787 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.787 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/disk.device.read.bytes volume: 23816192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.788 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '17849020-0a83-4f5a-86c9-2960de6e9b80', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23816192, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c-vda', 'timestamp': '2025-10-08T19:06:08.787810', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd90b4f70-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.38863171, 'message_signature': 'f7c86bb0cceec6358c9a6bb864979b95246c3d6c38380c9d66071d9532454c3b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c-sda', 'timestamp': '2025-10-08T19:06:08.787810', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd90b5cb8-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.38863171, 'message_signature': '4942e84989de3b61b0c113c3e33b33e7a6068cef98fba1ba5695a9f416511a81'}]}, 'timestamp': '2025-10-08 19:06:08.788542', '_unique_id': '71f33555e38d4a0bb97fe09ac0ffaa06'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.789 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.790 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.790 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '582234b2-548b-4909-9b17-093335ee7e5e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000001-533c431a-8ae8-4310-81dc-29285b78f93c-tap82f4743a-dc', 'timestamp': '2025-10-08T19:06:08.790401', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'tap82f4743a-dc', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2e:6b:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82f4743a-dc'}, 'message_id': 'd90bb2d0-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.351413058, 'message_signature': '4711387174c74737c60eeecbc3af46c5c0dfe110f4cdca85da8cc4403f022b8e'}]}, 'timestamp': '2025-10-08 19:06:08.790766', '_unique_id': 'd4b67bd24ebe426bb00a8d22525f6fa3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.791 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.792 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.792 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f508fd0-f7c3-439e-95c2-150604a99a95', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000001-533c431a-8ae8-4310-81dc-29285b78f93c-tap82f4743a-dc', 'timestamp': '2025-10-08T19:06:08.792403', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'tap82f4743a-dc', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2e:6b:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82f4743a-dc'}, 'message_id': 'd90c01c2-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.351413058, 'message_signature': '3b375ef2e2382ed9665fa77c67875eaeadf7e6949d4d910b043e18a9d769ffe5'}]}, 'timestamp': '2025-10-08 19:06:08.792826', '_unique_id': '012fc97dabd14fd199a4f622aff92514'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.793 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.794 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.794 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '802f9d92-874a-479d-b896-b9e3177c04d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000001-533c431a-8ae8-4310-81dc-29285b78f93c-tap82f4743a-dc', 'timestamp': '2025-10-08T19:06:08.794612', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'tap82f4743a-dc', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2e:6b:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82f4743a-dc'}, 'message_id': 'd90c55f0-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.351413058, 'message_signature': 'd2f46a9f3708a1889b8c680bf39dfa7e70d670e97ee86b1a8ac4d427fe29759f'}]}, 'timestamp': '2025-10-08 19:06:08.794958', '_unique_id': '7f7a852476244d32be286498f63b90a6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.795 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.796 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.796 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ab75317-c722-4360-a33b-fef661f0a0d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000001-533c431a-8ae8-4310-81dc-29285b78f93c-tap82f4743a-dc', 'timestamp': '2025-10-08T19:06:08.796468', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'tap82f4743a-dc', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2e:6b:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82f4743a-dc'}, 'message_id': 'd90c9e52-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.351413058, 'message_signature': 'b7f5a3a199d1274675072338663f2d4c5f6a4d0f4d83aa1c7287a4195ceb5f0b'}]}, 'timestamp': '2025-10-08 19:06:08.796780', '_unique_id': 'aa6b59e9f0eb449cac8fdbdd6bdc2d8a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.797 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.798 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.798 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.798 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-447228763>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-447228763>]
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.798 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.798 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '36864609-ae67-4e37-8cc4-34dc4c01d61c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000001-533c431a-8ae8-4310-81dc-29285b78f93c-tap82f4743a-dc', 'timestamp': '2025-10-08T19:06:08.798896', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'tap82f4743a-dc', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2e:6b:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82f4743a-dc'}, 'message_id': 'd90cfd48-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.351413058, 'message_signature': 'ad5eb97330b664aae4012415f1ac95c9d92817386fa3aa9100526665e5091283'}]}, 'timestamp': '2025-10-08 19:06:08.799222', '_unique_id': 'ad00028cc2d9402984125e0ea6ef5c5a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.799 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.800 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.801 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.801 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-447228763>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-447228763>]
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.801 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.818 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/cpu volume: 9340000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e3f4157d-bf78-4ce5-b3fb-c457622aa95f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9340000000, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'timestamp': '2025-10-08T19:06:08.801403', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'd90ff836-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.447065844, 'message_signature': 'a975082fcbfba5115fb76843583f1cc24d35464ebbec455a8c631a00489f694c'}]}, 'timestamp': '2025-10-08 19:06:08.818815', '_unique_id': '938cb870f7ba469b815c1a55aac58051'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.820 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.821 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.821 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '802bd66e-c197-451a-8b75-e8836fadce24', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000001-533c431a-8ae8-4310-81dc-29285b78f93c-tap82f4743a-dc', 'timestamp': '2025-10-08T19:06:08.821129', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'tap82f4743a-dc', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2e:6b:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82f4743a-dc'}, 'message_id': 'd9106258-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.351413058, 'message_signature': '7d4dacd3fb4947b3244105af2d95b57145ff52656f4373da2e5f5f2d8dd2275c'}]}, 'timestamp': '2025-10-08 19:06:08.821511', '_unique_id': '8c247212d1b74453ab468e38ab318b79'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.822 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.823 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.823 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.823 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84afa966-8d57-4fb6-b0d6-4dcc6b3fbcf7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c-vda', 'timestamp': '2025-10-08T19:06:08.823353', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd910b99c-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.38863171, 'message_signature': 'bb952d22a29a99e42c486397e1a76a0eb2298e0268ac8622c6edaa101f1fb412'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c-sda', 'timestamp': '2025-10-08T19:06:08.823353', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd910c4be-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.38863171, 'message_signature': '61c6664613ca96edf172e080fc6eceaa737d6173da56c3d82c549ce63468b56d'}]}, 'timestamp': '2025-10-08 19:06:08.824008', '_unique_id': '3fffd89452ff4866bbb5ebacd1473578'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.824 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.825 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.825 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.826 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '48d41bb4-f218-473f-bcaf-5101e7392a41', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c-vda', 'timestamp': '2025-10-08T19:06:08.825802', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9111b08-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.38863171, 'message_signature': '622529d7be731e9f047a489dfe2cf06b8c9fb498bf62b0a597ffc4a43e82a8dd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c-sda', 'timestamp': '2025-10-08T19:06:08.825802', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd911286e-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.38863171, 'message_signature': 'c9fa407418210e98d3f3402b72d383972a050fb1a64ee09176b2f3655e9db99c'}]}, 'timestamp': '2025-10-08 19:06:08.826519', '_unique_id': '56e481ee21544516bdc6775327afbd0c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.827 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.828 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.828 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c2fa4bb0-9907-4761-b285-709c2e25b9d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000001-533c431a-8ae8-4310-81dc-29285b78f93c-tap82f4743a-dc', 'timestamp': '2025-10-08T19:06:08.828369', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'tap82f4743a-dc', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2e:6b:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82f4743a-dc'}, 'message_id': 'd9117e86-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.351413058, 'message_signature': '65b1907fd40c52876db58cd4d145605f304b77a40ef0a42d64831927f6305a9f'}]}, 'timestamp': '2025-10-08 19:06:08.828774', '_unique_id': 'f72eb765c67f4120b11bdf607c123840'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.829 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.830 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.830 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9cfe6867-8cf2-4b99-808a-06a2b88b4807', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000001-533c431a-8ae8-4310-81dc-29285b78f93c-tap82f4743a-dc', 'timestamp': '2025-10-08T19:06:08.830536', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'tap82f4743a-dc', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:2e:6b:6c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap82f4743a-dc'}, 'message_id': 'd911d156-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.351413058, 'message_signature': 'c99aaf0f364bf07a400672550f31d61a9d8970469b104864c2c07699681a8149'}]}, 'timestamp': '2025-10-08 19:06:08.830882', '_unique_id': '784d032244604e1b8b17de2cf3dc3339'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.831 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.832 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.832 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.832 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e4b3963e-9bba-41a2-9917-e1d1c6ecf47a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c-vda', 'timestamp': '2025-10-08T19:06:08.832429', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd9121b34-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.375248515, 'message_signature': 'f05cd9e5a3c9df1b818769037658a10f3e3c7cd2f07b9983f3a362c2d1eec8aa'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 
'533c431a-8ae8-4310-81dc-29285b78f93c-sda', 'timestamp': '2025-10-08T19:06:08.832429', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd91227fa-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.375248515, 'message_signature': '675904e550859f9139f8817b21f244c47655534340fd0408280571c30d2db36b'}]}, 'timestamp': '2025-10-08 19:06:08.833056', '_unique_id': '732ee80b83fc4baca367eb12ee86e1b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.833 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.834 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.834 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/memory.usage volume: 40.3671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '77a9ef39-cc60-4708-a906-b38c87ef002e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.3671875, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'timestamp': '2025-10-08T19:06:08.834662', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'd9127520-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.447065844, 'message_signature': '886d17b0d1d51e2945633ada53176f67cd54da51f803c1eb9b0c828044b67747'}]}, 'timestamp': '2025-10-08 19:06:08.835057', '_unique_id': 'd4a942374c93446094d6e6046b5c8f35'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.835 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.836 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.837 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.837 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7686e3b6-0e94-4eb1-804d-478f989c60b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c-vda', 'timestamp': '2025-10-08T19:06:08.837057', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd912d196-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.375248515, 'message_signature': '7189b095546912ec9fd682df85ec703f69f371680def042d7cdb768cc8a4bcfc'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 
'533c431a-8ae8-4310-81dc-29285b78f93c-sda', 'timestamp': '2025-10-08T19:06:08.837057', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd912e0f0-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.375248515, 'message_signature': '748cca2cc3b444f4437be6a8baa8c9a84939772b7adc4f536820bf1bac1107c8'}]}, 'timestamp': '2025-10-08 19:06:08.837834', '_unique_id': 'a84a206ebacf421a9565e4c718cb7824'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.838 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.839 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.839 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/disk.device.read.requests volume: 770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.839 12 DEBUG ceilometer.compute.pollsters [-] 533c431a-8ae8-4310-81dc-29285b78f93c/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9f6a2dfd-8fa9-4050-ba35-1f77367cf81c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 770, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c-vda', 'timestamp': '2025-10-08T19:06:08.839522', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd913305a-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.38863171, 'message_signature': '04c5cfb6da0cac461baaf555918673a8740f269195ae4298a2b0def01cbb7c36'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 
'resource_id': '533c431a-8ae8-4310-81dc-29285b78f93c-sda', 'timestamp': '2025-10-08T19:06:08.839522', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-447228763', 'name': 'instance-00000001', 'instance_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd9133f14-a479-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1039.38863171, 'message_signature': 'a408eee8ea995999312433ccfa8bc84cba23b2e280c561ded213ac73830129a6'}]}, 'timestamp': '2025-10-08 19:06:08.840252', '_unique_id': 'b79882dccee5413c99f63ee4bbf57122'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:06:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:06:08.841 12 ERROR oslo_messaging.notify.messaging 
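The tracebacks above bottom out in `self.sock.connect(sa)` raising `ConnectionRefusedError: [Errno 111]`, which kombu re-raises as `OperationalError`: nothing is accepting TCP connections on the RabbitMQ endpoint the agent is configured for. A minimal sketch of how this errno surfaces at the socket layer (the host and port below are hypothetical stand-ins, not taken from this log):

```python
import errno
import socket

def probe(host: str, port: int) -> int:
    """Return 0 on a successful TCP connect, or the errno on refusal."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.settimeout(2)
    try:
        s.connect((host, port))
        return 0
    except ConnectionRefusedError as exc:
        # On Linux, a closed port answers with RST and connect()
        # fails with ECONNREFUSED, which is errno 111.
        return exc.errno
    finally:
        s.close()

# Port 1 on loopback is essentially never bound, so this demonstrates
# the same failure mode the amqp transport hit above.
print(probe("127.0.0.1", 1) == errno.ECONNREFUSED)
```

The same check against the real transport URL from `ceilometer.conf` (or `ss -ltn` on the RabbitMQ host) distinguishes "broker down / not yet started" from a firewall drop, which would time out rather than refuse.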
Oct  8 19:06:10 compute-0 nova_compute[117514]: 2025-10-08 19:06:10.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:11 compute-0 ovn_controller[19759]: 2025-10-08T19:06:11Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2e:6b:6c 10.100.0.3
Oct  8 19:06:11 compute-0 ovn_controller[19759]: 2025-10-08T19:06:11Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2e:6b:6c 10.100.0.3
Oct  8 19:06:12 compute-0 nova_compute[117514]: 2025-10-08 19:06:12.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:13 compute-0 podman[144886]: 2025-10-08 19:06:13.656084282 +0000 UTC m=+0.078392694 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, 
container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 19:06:15 compute-0 nova_compute[117514]: 2025-10-08 19:06:15.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:17 compute-0 nova_compute[117514]: 2025-10-08 19:06:17.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:18 compute-0 nova_compute[117514]: 2025-10-08 19:06:18.034 2 INFO nova.compute.manager [None req-ba8ce6c8-3274-4373-9943-af7f439b2e3b efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Get console output#033[00m
Oct  8 19:06:18 compute-0 nova_compute[117514]: 2025-10-08 19:06:18.161 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 19:06:19 compute-0 podman[144907]: 2025-10-08 19:06:19.674103045 +0000 UTC m=+0.073486097 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 19:06:19 compute-0 podman[144906]: 2025-10-08 19:06:19.706752818 +0000 UTC m=+0.115147002 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, release=1755695350, vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  8 19:06:20 compute-0 nova_compute[117514]: 2025-10-08 19:06:20.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:22 compute-0 nova_compute[117514]: 2025-10-08 19:06:22.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:23 compute-0 podman[144945]: 2025-10-08 19:06:23.66632635 +0000 UTC m=+0.081844779 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 19:06:25 compute-0 nova_compute[117514]: 2025-10-08 19:06:25.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:25 compute-0 podman[144969]: 2025-10-08 19:06:25.6337282 +0000 UTC m=+0.052210339 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 19:06:25 compute-0 podman[144970]: 2025-10-08 19:06:25.664769491 +0000 UTC m=+0.066769167 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 19:06:27 compute-0 nova_compute[117514]: 2025-10-08 19:06:27.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:27 compute-0 podman[145008]: 2025-10-08 19:06:27.683151723 +0000 UTC m=+0.110399189 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  8 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.394 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.395 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.412 2 DEBUG nova.compute.manager [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.496 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.497 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.506 2 DEBUG nova.virt.hardware [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.506 2 INFO nova.compute.claims [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Claim successful on node compute-0.ctlplane.example.com#033[00m
Oct  8 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.625 2 DEBUG nova.compute.provider_tree [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.650 2 DEBUG nova.scheduler.client.report [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.687 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.688 2 DEBUG nova.compute.manager [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.738 2 DEBUG nova.compute.manager [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.739 2 DEBUG nova.network.neutron [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.761 2 INFO nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.778 2 DEBUG nova.compute.manager [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.876 2 DEBUG nova.compute.manager [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.878 2 DEBUG nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.879 2 INFO nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Creating image(s)#033[00m
Oct  8 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.880 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "/var/lib/nova/instances/432f298f-78dd-4e9e-9ee4-279c2bc544c1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.880 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/432f298f-78dd-4e9e-9ee4-279c2bc544c1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.881 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/432f298f-78dd-4e9e-9ee4-279c2bc544c1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.904 2 DEBUG oslo_concurrency.processutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.995 2 DEBUG oslo_concurrency.processutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.996 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "008eb3078b811ee47058b7252a820910c35fc6df" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:06:29 compute-0 nova_compute[117514]: 2025-10-08 19:06:29.997 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.012 2 DEBUG oslo_concurrency.processutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.035 2 DEBUG nova.policy [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.081 2 DEBUG oslo_concurrency.processutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.082 2 DEBUG oslo_concurrency.processutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/432f298f-78dd-4e9e-9ee4-279c2bc544c1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.127 2 DEBUG oslo_concurrency.processutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/432f298f-78dd-4e9e-9ee4-279c2bc544c1/disk 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.131 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.132 2 DEBUG oslo_concurrency.processutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.216 2 DEBUG oslo_concurrency.processutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.218 2 DEBUG nova.virt.disk.api [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Checking if we can resize image /var/lib/nova/instances/432f298f-78dd-4e9e-9ee4-279c2bc544c1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  8 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.219 2 DEBUG oslo_concurrency.processutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/432f298f-78dd-4e9e-9ee4-279c2bc544c1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.311 2 DEBUG oslo_concurrency.processutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/432f298f-78dd-4e9e-9ee4-279c2bc544c1/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.313 2 DEBUG nova.virt.disk.api [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Cannot resize image /var/lib/nova/instances/432f298f-78dd-4e9e-9ee4-279c2bc544c1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  8 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.313 2 DEBUG nova.objects.instance [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'migration_context' on Instance uuid 432f298f-78dd-4e9e-9ee4-279c2bc544c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.330 2 DEBUG nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.330 2 DEBUG nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Ensure instance console log exists: /var/lib/nova/instances/432f298f-78dd-4e9e-9ee4-279c2bc544c1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.331 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.332 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.333 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:30 compute-0 nova_compute[117514]: 2025-10-08 19:06:30.607 2 DEBUG nova.network.neutron [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Successfully created port: 41ab28a1-9254-46a6-97fc-2220fe30eccd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 19:06:31 compute-0 nova_compute[117514]: 2025-10-08 19:06:31.555 2 DEBUG nova.network.neutron [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Successfully updated port: 41ab28a1-9254-46a6-97fc-2220fe30eccd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 19:06:31 compute-0 nova_compute[117514]: 2025-10-08 19:06:31.573 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "refresh_cache-432f298f-78dd-4e9e-9ee4-279c2bc544c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:06:31 compute-0 nova_compute[117514]: 2025-10-08 19:06:31.573 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquired lock "refresh_cache-432f298f-78dd-4e9e-9ee4-279c2bc544c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:06:31 compute-0 nova_compute[117514]: 2025-10-08 19:06:31.574 2 DEBUG nova.network.neutron [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 19:06:31 compute-0 nova_compute[117514]: 2025-10-08 19:06:31.635 2 DEBUG nova.compute.manager [req-cb7578e8-60fb-4c89-b616-ba692ab08795 req-da88d31b-49e2-4ee5-b7f5-8e9a8d101196 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Received event network-changed-41ab28a1-9254-46a6-97fc-2220fe30eccd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:06:31 compute-0 nova_compute[117514]: 2025-10-08 19:06:31.635 2 DEBUG nova.compute.manager [req-cb7578e8-60fb-4c89-b616-ba692ab08795 req-da88d31b-49e2-4ee5-b7f5-8e9a8d101196 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Refreshing instance network info cache due to event network-changed-41ab28a1-9254-46a6-97fc-2220fe30eccd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 19:06:31 compute-0 nova_compute[117514]: 2025-10-08 19:06:31.636 2 DEBUG oslo_concurrency.lockutils [req-cb7578e8-60fb-4c89-b616-ba692ab08795 req-da88d31b-49e2-4ee5-b7f5-8e9a8d101196 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-432f298f-78dd-4e9e-9ee4-279c2bc544c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:06:31 compute-0 nova_compute[117514]: 2025-10-08 19:06:31.696 2 DEBUG nova.network.neutron [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 19:06:32 compute-0 nova_compute[117514]: 2025-10-08 19:06:32.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.043 2 DEBUG nova.network.neutron [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Updating instance_info_cache with network_info: [{"id": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "address": "fa:16:3e:a4:ab:8d", "network": {"id": "3f19211d-1888-42c2-a8ff-1de7bc4f9219", "bridge": "br-int", "label": "tempest-network-smoke--1824442985", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41ab28a1-92", "ovs_interfaceid": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.061 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Releasing lock "refresh_cache-432f298f-78dd-4e9e-9ee4-279c2bc544c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.062 2 DEBUG nova.compute.manager [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Instance network_info: |[{"id": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "address": "fa:16:3e:a4:ab:8d", "network": {"id": "3f19211d-1888-42c2-a8ff-1de7bc4f9219", "bridge": "br-int", "label": "tempest-network-smoke--1824442985", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41ab28a1-92", "ovs_interfaceid": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.062 2 DEBUG oslo_concurrency.lockutils [req-cb7578e8-60fb-4c89-b616-ba692ab08795 req-da88d31b-49e2-4ee5-b7f5-8e9a8d101196 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-432f298f-78dd-4e9e-9ee4-279c2bc544c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.063 2 DEBUG nova.network.neutron [req-cb7578e8-60fb-4c89-b616-ba692ab08795 req-da88d31b-49e2-4ee5-b7f5-8e9a8d101196 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Refreshing network info cache for port 41ab28a1-9254-46a6-97fc-2220fe30eccd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.066 2 DEBUG nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Start _get_guest_xml network_info=[{"id": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "address": "fa:16:3e:a4:ab:8d", "network": {"id": "3f19211d-1888-42c2-a8ff-1de7bc4f9219", "bridge": "br-int", "label": "tempest-network-smoke--1824442985", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41ab28a1-92", "ovs_interfaceid": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'image_id': '23cfa426-7011-4566-992d-1c7af39f70dd'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.072 2 WARNING nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.079 2 DEBUG nova.virt.libvirt.host [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.080 2 DEBUG nova.virt.libvirt.host [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.083 2 DEBUG nova.virt.libvirt.host [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.084 2 DEBUG nova.virt.libvirt.host [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.085 2 DEBUG nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.085 2 DEBUG nova.virt.hardware [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T19:05:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e8a148fc-4419-4813-98ff-a17e2a95609e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.085 2 DEBUG nova.virt.hardware [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.086 2 DEBUG nova.virt.hardware [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.086 2 DEBUG nova.virt.hardware [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.086 2 DEBUG nova.virt.hardware [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.086 2 DEBUG nova.virt.hardware [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.087 2 DEBUG nova.virt.hardware [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.087 2 DEBUG nova.virt.hardware [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.087 2 DEBUG nova.virt.hardware [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.088 2 DEBUG nova.virt.hardware [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.088 2 DEBUG nova.virt.hardware [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.092 2 DEBUG nova.virt.libvirt.vif [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:06:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-389250187',display_name='tempest-TestNetworkBasicOps-server-389250187',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-389250187',id=2,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAad1Pe21vvEy0SgHMu1VF6n4SO9ujKCPB0SgACJ8nvOuhW/VjCSPOSETWk3+gFjb/KHaSwvZLGtfcSFz4SkdC0dg68nGstzwyBghc627R2c2cxKu7YHJFmDoK+RJ/yIQg==',key_name='tempest-TestNetworkBasicOps-1028595893',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-e7tbigki',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:06:29Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=432f298f-78dd-4e9e-9ee4-279c2bc544c1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "address": "fa:16:3e:a4:ab:8d", "network": {"id": "3f19211d-1888-42c2-a8ff-1de7bc4f9219", "bridge": "br-int", "label": "tempest-network-smoke--1824442985", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41ab28a1-92", "ovs_interfaceid": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.092 2 DEBUG nova.network.os_vif_util [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "address": "fa:16:3e:a4:ab:8d", "network": {"id": "3f19211d-1888-42c2-a8ff-1de7bc4f9219", "bridge": "br-int", "label": "tempest-network-smoke--1824442985", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41ab28a1-92", "ovs_interfaceid": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.093 2 DEBUG nova.network.os_vif_util [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a4:ab:8d,bridge_name='br-int',has_traffic_filtering=True,id=41ab28a1-9254-46a6-97fc-2220fe30eccd,network=Network(3f19211d-1888-42c2-a8ff-1de7bc4f9219),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41ab28a1-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.094 2 DEBUG nova.objects.instance [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 432f298f-78dd-4e9e-9ee4-279c2bc544c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.107 2 DEBUG nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] End _get_guest_xml xml=<domain type="kvm">
Oct  8 19:06:35 compute-0 nova_compute[117514]:  <uuid>432f298f-78dd-4e9e-9ee4-279c2bc544c1</uuid>
Oct  8 19:06:35 compute-0 nova_compute[117514]:  <name>instance-00000002</name>
Oct  8 19:06:35 compute-0 nova_compute[117514]:  <memory>131072</memory>
Oct  8 19:06:35 compute-0 nova_compute[117514]:  <vcpu>1</vcpu>
Oct  8 19:06:35 compute-0 nova_compute[117514]:  <metadata>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 19:06:35 compute-0 nova_compute[117514]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:      <nova:name>tempest-TestNetworkBasicOps-server-389250187</nova:name>
Oct  8 19:06:35 compute-0 nova_compute[117514]:      <nova:creationTime>2025-10-08 19:06:35</nova:creationTime>
Oct  8 19:06:35 compute-0 nova_compute[117514]:      <nova:flavor name="m1.nano">
Oct  8 19:06:35 compute-0 nova_compute[117514]:        <nova:memory>128</nova:memory>
Oct  8 19:06:35 compute-0 nova_compute[117514]:        <nova:disk>1</nova:disk>
Oct  8 19:06:35 compute-0 nova_compute[117514]:        <nova:swap>0</nova:swap>
Oct  8 19:06:35 compute-0 nova_compute[117514]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 19:06:35 compute-0 nova_compute[117514]:        <nova:vcpus>1</nova:vcpus>
Oct  8 19:06:35 compute-0 nova_compute[117514]:      </nova:flavor>
Oct  8 19:06:35 compute-0 nova_compute[117514]:      <nova:owner>
Oct  8 19:06:35 compute-0 nova_compute[117514]:        <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct  8 19:06:35 compute-0 nova_compute[117514]:        <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct  8 19:06:35 compute-0 nova_compute[117514]:      </nova:owner>
Oct  8 19:06:35 compute-0 nova_compute[117514]:      <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:      <nova:ports>
Oct  8 19:06:35 compute-0 nova_compute[117514]:        <nova:port uuid="41ab28a1-9254-46a6-97fc-2220fe30eccd">
Oct  8 19:06:35 compute-0 nova_compute[117514]:          <nova:ip type="fixed" address="10.100.0.25" ipVersion="4"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:        </nova:port>
Oct  8 19:06:35 compute-0 nova_compute[117514]:      </nova:ports>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    </nova:instance>
Oct  8 19:06:35 compute-0 nova_compute[117514]:  </metadata>
Oct  8 19:06:35 compute-0 nova_compute[117514]:  <sysinfo type="smbios">
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <system>
Oct  8 19:06:35 compute-0 nova_compute[117514]:      <entry name="manufacturer">RDO</entry>
Oct  8 19:06:35 compute-0 nova_compute[117514]:      <entry name="product">OpenStack Compute</entry>
Oct  8 19:06:35 compute-0 nova_compute[117514]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 19:06:35 compute-0 nova_compute[117514]:      <entry name="serial">432f298f-78dd-4e9e-9ee4-279c2bc544c1</entry>
Oct  8 19:06:35 compute-0 nova_compute[117514]:      <entry name="uuid">432f298f-78dd-4e9e-9ee4-279c2bc544c1</entry>
Oct  8 19:06:35 compute-0 nova_compute[117514]:      <entry name="family">Virtual Machine</entry>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    </system>
Oct  8 19:06:35 compute-0 nova_compute[117514]:  </sysinfo>
Oct  8 19:06:35 compute-0 nova_compute[117514]:  <os>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <boot dev="hd"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <smbios mode="sysinfo"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:  </os>
Oct  8 19:06:35 compute-0 nova_compute[117514]:  <features>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <acpi/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <apic/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <vmcoreinfo/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:  </features>
Oct  8 19:06:35 compute-0 nova_compute[117514]:  <clock offset="utc">
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <timer name="hpet" present="no"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:  </clock>
Oct  8 19:06:35 compute-0 nova_compute[117514]:  <cpu mode="host-model" match="exact">
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:  </cpu>
Oct  8 19:06:35 compute-0 nova_compute[117514]:  <devices>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <disk type="file" device="disk">
Oct  8 19:06:35 compute-0 nova_compute[117514]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:      <source file="/var/lib/nova/instances/432f298f-78dd-4e9e-9ee4-279c2bc544c1/disk"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:      <target dev="vda" bus="virtio"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <disk type="file" device="cdrom">
Oct  8 19:06:35 compute-0 nova_compute[117514]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:      <source file="/var/lib/nova/instances/432f298f-78dd-4e9e-9ee4-279c2bc544c1/disk.config"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:      <target dev="sda" bus="sata"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <interface type="ethernet">
Oct  8 19:06:35 compute-0 nova_compute[117514]:      <mac address="fa:16:3e:a4:ab:8d"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:      <model type="virtio"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:      <mtu size="1442"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:      <target dev="tap41ab28a1-92"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    </interface>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <serial type="pty">
Oct  8 19:06:35 compute-0 nova_compute[117514]:      <log file="/var/lib/nova/instances/432f298f-78dd-4e9e-9ee4-279c2bc544c1/console.log" append="off"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    </serial>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <video>
Oct  8 19:06:35 compute-0 nova_compute[117514]:      <model type="virtio"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    </video>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <input type="tablet" bus="usb"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <rng model="virtio">
Oct  8 19:06:35 compute-0 nova_compute[117514]:      <backend model="random">/dev/urandom</backend>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    </rng>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <controller type="usb" index="0"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    <memballoon model="virtio">
Oct  8 19:06:35 compute-0 nova_compute[117514]:      <stats period="10"/>
Oct  8 19:06:35 compute-0 nova_compute[117514]:    </memballoon>
Oct  8 19:06:35 compute-0 nova_compute[117514]:  </devices>
Oct  8 19:06:35 compute-0 nova_compute[117514]: </domain>
Oct  8 19:06:35 compute-0 nova_compute[117514]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.109 2 DEBUG nova.compute.manager [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Preparing to wait for external event network-vif-plugged-41ab28a1-9254-46a6-97fc-2220fe30eccd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.109 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.110 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.110 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.112 2 DEBUG nova.virt.libvirt.vif [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:06:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-389250187',display_name='tempest-TestNetworkBasicOps-server-389250187',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-389250187',id=2,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAad1Pe21vvEy0SgHMu1VF6n4SO9ujKCPB0SgACJ8nvOuhW/VjCSPOSETWk3+gFjb/KHaSwvZLGtfcSFz4SkdC0dg68nGstzwyBghc627R2c2cxKu7YHJFmDoK+RJ/yIQg==',key_name='tempest-TestNetworkBasicOps-1028595893',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-e7tbigki',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:06:29Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=432f298f-78dd-4e9e-9ee4-279c2bc544c1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "address": "fa:16:3e:a4:ab:8d", "network": {"id": "3f19211d-1888-42c2-a8ff-1de7bc4f9219", "bridge": "br-int", "label": "tempest-network-smoke--1824442985", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41ab28a1-92", "ovs_interfaceid": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.112 2 DEBUG nova.network.os_vif_util [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "address": "fa:16:3e:a4:ab:8d", "network": {"id": "3f19211d-1888-42c2-a8ff-1de7bc4f9219", "bridge": "br-int", "label": "tempest-network-smoke--1824442985", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41ab28a1-92", "ovs_interfaceid": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.113 2 DEBUG nova.network.os_vif_util [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a4:ab:8d,bridge_name='br-int',has_traffic_filtering=True,id=41ab28a1-9254-46a6-97fc-2220fe30eccd,network=Network(3f19211d-1888-42c2-a8ff-1de7bc4f9219),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41ab28a1-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.114 2 DEBUG os_vif [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:ab:8d,bridge_name='br-int',has_traffic_filtering=True,id=41ab28a1-9254-46a6-97fc-2220fe30eccd,network=Network(3f19211d-1888-42c2-a8ff-1de7bc4f9219),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41ab28a1-92') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.115 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.116 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.116 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.121 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap41ab28a1-92, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.122 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap41ab28a1-92, col_values=(('external_ids', {'iface-id': '41ab28a1-9254-46a6-97fc-2220fe30eccd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a4:ab:8d', 'vm-uuid': '432f298f-78dd-4e9e-9ee4-279c2bc544c1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.124 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:35 compute-0 NetworkManager[1035]: <info>  [1759950395.1258] manager: (tap41ab28a1-92): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/29)
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.135 2 INFO os_vif [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:ab:8d,bridge_name='br-int',has_traffic_filtering=True,id=41ab28a1-9254-46a6-97fc-2220fe30eccd,network=Network(3f19211d-1888-42c2-a8ff-1de7bc4f9219),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41ab28a1-92')#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.199 2 DEBUG nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.200 2 DEBUG nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.200 2 DEBUG nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No VIF found with MAC fa:16:3e:a4:ab:8d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.201 2 INFO nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Using config drive#033[00m
Oct  8 19:06:35 compute-0 nova_compute[117514]: 2025-10-08 19:06:35.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:35 compute-0 podman[145051]: 2025-10-08 19:06:35.673824007 +0000 UTC m=+0.085424003 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 19:06:36 compute-0 nova_compute[117514]: 2025-10-08 19:06:36.182 2 INFO nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Creating config drive at /var/lib/nova/instances/432f298f-78dd-4e9e-9ee4-279c2bc544c1/disk.config#033[00m
Oct  8 19:06:36 compute-0 nova_compute[117514]: 2025-10-08 19:06:36.186 2 DEBUG oslo_concurrency.processutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/432f298f-78dd-4e9e-9ee4-279c2bc544c1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_7kqzzqc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:06:36 compute-0 nova_compute[117514]: 2025-10-08 19:06:36.323 2 DEBUG oslo_concurrency.processutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/432f298f-78dd-4e9e-9ee4-279c2bc544c1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_7kqzzqc" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:06:36 compute-0 NetworkManager[1035]: <info>  [1759950396.4017] manager: (tap41ab28a1-92): new Tun device (/org/freedesktop/NetworkManager/Devices/30)
Oct  8 19:06:36 compute-0 kernel: tap41ab28a1-92: entered promiscuous mode
Oct  8 19:06:36 compute-0 ovn_controller[19759]: 2025-10-08T19:06:36Z|00034|binding|INFO|Claiming lport 41ab28a1-9254-46a6-97fc-2220fe30eccd for this chassis.
Oct  8 19:06:36 compute-0 ovn_controller[19759]: 2025-10-08T19:06:36Z|00035|binding|INFO|41ab28a1-9254-46a6-97fc-2220fe30eccd: Claiming fa:16:3e:a4:ab:8d 10.100.0.25
Oct  8 19:06:36 compute-0 nova_compute[117514]: 2025-10-08 19:06:36.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.421 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a4:ab:8d 10.100.0.25'], port_security=['fa:16:3e:a4:ab:8d 10.100.0.25'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.25/28', 'neutron:device_id': '432f298f-78dd-4e9e-9ee4-279c2bc544c1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f19211d-1888-42c2-a8ff-1de7bc4f9219', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b1b038d5-57f5-4b2c-9de0-90d7e6862c10', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=80124168-1b37-4a7c-9765-130e2be44549, chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=41ab28a1-9254-46a6-97fc-2220fe30eccd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.423 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 41ab28a1-9254-46a6-97fc-2220fe30eccd in datapath 3f19211d-1888-42c2-a8ff-1de7bc4f9219 bound to our chassis#033[00m
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.424 28643 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3f19211d-1888-42c2-a8ff-1de7bc4f9219#033[00m
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.440 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[88396f44-3558-4413-a7a6-76b6a51ca2bc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.441 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3f19211d-11 in ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.443 144726 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3f19211d-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.443 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[b8c28486-dbc7-4676-bbc5-5bd53497e6ba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.444 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[2c6c4160-8025-4612-885d-276c70eeaf17]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:06:36 compute-0 systemd-machined[77568]: New machine qemu-2-instance-00000002.
Oct  8 19:06:36 compute-0 systemd-udevd[145095]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 19:06:36 compute-0 NetworkManager[1035]: <info>  [1759950396.4672] device (tap41ab28a1-92): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 19:06:36 compute-0 nova_compute[117514]: 2025-10-08 19:06:36.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:36 compute-0 NetworkManager[1035]: <info>  [1759950396.4704] device (tap41ab28a1-92): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.470 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[edd2e2c5-1859-40ff-9d86-e580d560130a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:06:36 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000002.
Oct  8 19:06:36 compute-0 ovn_controller[19759]: 2025-10-08T19:06:36Z|00036|binding|INFO|Setting lport 41ab28a1-9254-46a6-97fc-2220fe30eccd ovn-installed in OVS
Oct  8 19:06:36 compute-0 ovn_controller[19759]: 2025-10-08T19:06:36Z|00037|binding|INFO|Setting lport 41ab28a1-9254-46a6-97fc-2220fe30eccd up in Southbound
Oct  8 19:06:36 compute-0 nova_compute[117514]: 2025-10-08 19:06:36.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.505 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[3f28e896-618a-4c56-81c3-2a4a161f7aa8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.538 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[7872bd47-4479-48f0-a849-9135d4168694]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.543 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[c807b19e-02c3-4857-b873-b74ec07c649c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:06:36 compute-0 NetworkManager[1035]: <info>  [1759950396.5454] manager: (tap3f19211d-10): new Veth device (/org/freedesktop/NetworkManager/Devices/31)
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.577 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[3c9c0af3-a790-4327-b448-b188d34b4443]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.580 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[a64e6b3d-d67d-4147-ab2b-3afb4bace699]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:06:36 compute-0 NetworkManager[1035]: <info>  [1759950396.6073] device (tap3f19211d-10): carrier: link connected
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.613 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[bbb40408-b3c3-4f29-89a1-7f4eb0ec7f6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.631 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[d38d5944-fe21-4790-a3f3-7d6cc27ed853]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3f19211d-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5a:4d:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 106718, 'reachable_time': 16095, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 145127, 'error': None, 'target': 'ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.648 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[bda7f2f4-ef6f-412c-9407-1bb43816aa0f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5a:4d14'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 106718, 'tstamp': 106718}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 145128, 'error': None, 'target': 'ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.665 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[10cf1b51-d01c-4d1f-9ec4-71df8b6abb9c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3f19211d-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5a:4d:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 106718, 'reachable_time': 16095, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 145129, 'error': None, 'target': 'ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.698 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[f3cbc4e6-2828-4f3d-a530-5768b3e8310c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.789 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[1840d3ed-30c1-40cd-bd4c-0101c1148d8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.792 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3f19211d-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.792 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.793 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3f19211d-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:06:36 compute-0 nova_compute[117514]: 2025-10-08 19:06:36.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:36 compute-0 kernel: tap3f19211d-10: entered promiscuous mode
Oct  8 19:06:36 compute-0 NetworkManager[1035]: <info>  [1759950396.7970] manager: (tap3f19211d-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.800 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3f19211d-10, col_values=(('external_ids', {'iface-id': '794073fc-ca71-4a94-857f-c3e735aa1420'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:06:36 compute-0 ovn_controller[19759]: 2025-10-08T19:06:36Z|00038|binding|INFO|Releasing lport 794073fc-ca71-4a94-857f-c3e735aa1420 from this chassis (sb_readonly=0)
Oct  8 19:06:36 compute-0 nova_compute[117514]: 2025-10-08 19:06:36.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:36 compute-0 nova_compute[117514]: 2025-10-08 19:06:36.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:36 compute-0 nova_compute[117514]: 2025-10-08 19:06:36.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.828 28643 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3f19211d-1888-42c2-a8ff-1de7bc4f9219.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3f19211d-1888-42c2-a8ff-1de7bc4f9219.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.829 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[71f8d7f2-a57d-4c00-9239-57cc019cb22a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.830 28643 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]: global
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]:    log         /dev/log local0 debug
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]:    log-tag     haproxy-metadata-proxy-3f19211d-1888-42c2-a8ff-1de7bc4f9219
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]:    user        root
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]:    group       root
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]:    maxconn     1024
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]:    pidfile     /var/lib/neutron/external/pids/3f19211d-1888-42c2-a8ff-1de7bc4f9219.pid.haproxy
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]:    daemon
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]: 
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]: defaults
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]:    log global
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]:    mode http
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]:    option httplog
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]:    option dontlognull
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]:    option http-server-close
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]:    option forwardfor
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]:    retries                 3
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]:    timeout http-request    30s
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]:    timeout connect         30s
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]:    timeout client          32s
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]:    timeout server          32s
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]:    timeout http-keep-alive 30s
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]: 
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]: 
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]: listen listener
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]:    bind 169.254.169.254:80
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]:    http-request add-header X-OVN-Network-ID 3f19211d-1888-42c2-a8ff-1de7bc4f9219
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 19:06:36 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:36.831 28643 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219', 'env', 'PROCESS_TAG=haproxy-3f19211d-1888-42c2-a8ff-1de7bc4f9219', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3f19211d-1888-42c2-a8ff-1de7bc4f9219.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.260 2 DEBUG nova.compute.manager [req-ab570acd-d15f-4173-b8af-e954624565c4 req-9007d8fb-d184-4bd8-93eb-3dd9cf487754 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Received event network-vif-plugged-41ab28a1-9254-46a6-97fc-2220fe30eccd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.262 2 DEBUG oslo_concurrency.lockutils [req-ab570acd-d15f-4173-b8af-e954624565c4 req-9007d8fb-d184-4bd8-93eb-3dd9cf487754 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.263 2 DEBUG oslo_concurrency.lockutils [req-ab570acd-d15f-4173-b8af-e954624565c4 req-9007d8fb-d184-4bd8-93eb-3dd9cf487754 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.264 2 DEBUG oslo_concurrency.lockutils [req-ab570acd-d15f-4173-b8af-e954624565c4 req-9007d8fb-d184-4bd8-93eb-3dd9cf487754 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.264 2 DEBUG nova.compute.manager [req-ab570acd-d15f-4173-b8af-e954624565c4 req-9007d8fb-d184-4bd8-93eb-3dd9cf487754 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Processing event network-vif-plugged-41ab28a1-9254-46a6-97fc-2220fe30eccd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 19:06:37 compute-0 podman[145169]: 2025-10-08 19:06:37.279423324 +0000 UTC m=+0.087014778 container create 2fa83a290b4ce32236b9a062240e91de3f4adbdabf9275d47c9271f48749cdd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  8 19:06:37 compute-0 podman[145169]: 2025-10-08 19:06:37.222967024 +0000 UTC m=+0.030558528 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct  8 19:06:37 compute-0 systemd[1]: Started libpod-conmon-2fa83a290b4ce32236b9a062240e91de3f4adbdabf9275d47c9271f48749cdd9.scope.
Oct  8 19:06:37 compute-0 systemd[1]: Started libcrun container.
Oct  8 19:06:37 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f952153b9c075c928f3a4048d93d2e597dc86bce3878df25663d6f266130ce7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.377 2 DEBUG nova.compute.manager [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.379 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950397.3764095, 432f298f-78dd-4e9e-9ee4-279c2bc544c1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.379 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] VM Started (Lifecycle Event)#033[00m
Oct  8 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.382 2 DEBUG nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.386 2 INFO nova.virt.libvirt.driver [-] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Instance spawned successfully.#033[00m
Oct  8 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.386 2 DEBUG nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.396 2 DEBUG nova.network.neutron [req-cb7578e8-60fb-4c89-b616-ba692ab08795 req-da88d31b-49e2-4ee5-b7f5-8e9a8d101196 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Updated VIF entry in instance network info cache for port 41ab28a1-9254-46a6-97fc-2220fe30eccd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.397 2 DEBUG nova.network.neutron [req-cb7578e8-60fb-4c89-b616-ba692ab08795 req-da88d31b-49e2-4ee5-b7f5-8e9a8d101196 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Updating instance_info_cache with network_info: [{"id": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "address": "fa:16:3e:a4:ab:8d", "network": {"id": "3f19211d-1888-42c2-a8ff-1de7bc4f9219", "bridge": "br-int", "label": "tempest-network-smoke--1824442985", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41ab28a1-92", "ovs_interfaceid": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:06:37 compute-0 podman[145169]: 2025-10-08 19:06:37.397599435 +0000 UTC m=+0.205190959 container init 2fa83a290b4ce32236b9a062240e91de3f4adbdabf9275d47c9271f48749cdd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  8 19:06:37 compute-0 podman[145169]: 2025-10-08 19:06:37.404151893 +0000 UTC m=+0.211743347 container start 2fa83a290b4ce32236b9a062240e91de3f4adbdabf9275d47c9271f48749cdd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2)
Oct  8 19:06:37 compute-0 neutron-haproxy-ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219[145184]: [NOTICE]   (145188) : New worker (145190) forked
Oct  8 19:06:37 compute-0 neutron-haproxy-ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219[145184]: [NOTICE]   (145188) : Loading success.
Oct  8 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.432 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.436 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.481 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.482 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950397.3766084, 432f298f-78dd-4e9e-9ee4-279c2bc544c1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.482 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] VM Paused (Lifecycle Event)#033[00m
Oct  8 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.485 2 DEBUG oslo_concurrency.lockutils [req-cb7578e8-60fb-4c89-b616-ba692ab08795 req-da88d31b-49e2-4ee5-b7f5-8e9a8d101196 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-432f298f-78dd-4e9e-9ee4-279c2bc544c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.491 2 DEBUG nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.492 2 DEBUG nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.492 2 DEBUG nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.493 2 DEBUG nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.493 2 DEBUG nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.493 2 DEBUG nova.virt.libvirt.driver [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.498 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.501 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950397.3810613, 432f298f-78dd-4e9e-9ee4-279c2bc544c1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.501 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] VM Resumed (Lifecycle Event)#033[00m
Oct  8 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.524 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.528 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.548 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.560 2 INFO nova.compute.manager [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Took 7.68 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.561 2 DEBUG nova.compute.manager [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.645 2 INFO nova.compute.manager [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Took 8.18 seconds to build instance.#033[00m
Oct  8 19:06:37 compute-0 nova_compute[117514]: 2025-10-08 19:06:37.664 2 DEBUG oslo_concurrency.lockutils [None req-06e057b2-b7d7-4101-87f0-3a8a4556b3a6 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.269s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:06:39 compute-0 nova_compute[117514]: 2025-10-08 19:06:39.347 2 DEBUG nova.compute.manager [req-6cfb2163-bb22-46bd-b4b8-df981f56858f req-b59014ed-836e-4321-8a59-e516b4ef4969 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Received event network-vif-plugged-41ab28a1-9254-46a6-97fc-2220fe30eccd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:06:39 compute-0 nova_compute[117514]: 2025-10-08 19:06:39.348 2 DEBUG oslo_concurrency.lockutils [req-6cfb2163-bb22-46bd-b4b8-df981f56858f req-b59014ed-836e-4321-8a59-e516b4ef4969 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:06:39 compute-0 nova_compute[117514]: 2025-10-08 19:06:39.348 2 DEBUG oslo_concurrency.lockutils [req-6cfb2163-bb22-46bd-b4b8-df981f56858f req-b59014ed-836e-4321-8a59-e516b4ef4969 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:06:39 compute-0 nova_compute[117514]: 2025-10-08 19:06:39.349 2 DEBUG oslo_concurrency.lockutils [req-6cfb2163-bb22-46bd-b4b8-df981f56858f req-b59014ed-836e-4321-8a59-e516b4ef4969 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:06:39 compute-0 nova_compute[117514]: 2025-10-08 19:06:39.349 2 DEBUG nova.compute.manager [req-6cfb2163-bb22-46bd-b4b8-df981f56858f req-b59014ed-836e-4321-8a59-e516b4ef4969 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] No waiting events found dispatching network-vif-plugged-41ab28a1-9254-46a6-97fc-2220fe30eccd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:06:39 compute-0 nova_compute[117514]: 2025-10-08 19:06:39.350 2 WARNING nova.compute.manager [req-6cfb2163-bb22-46bd-b4b8-df981f56858f req-b59014ed-836e-4321-8a59-e516b4ef4969 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Received unexpected event network-vif-plugged-41ab28a1-9254-46a6-97fc-2220fe30eccd for instance with vm_state active and task_state None.#033[00m
Oct  8 19:06:40 compute-0 nova_compute[117514]: 2025-10-08 19:06:40.126 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:40 compute-0 nova_compute[117514]: 2025-10-08 19:06:40.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:42 compute-0 nova_compute[117514]: 2025-10-08 19:06:42.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:42 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:42.444 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:75:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '5e:14:dd:63:55:2a'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:06:42 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:42.447 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 19:06:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:44.226 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:06:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:44.227 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:06:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:44.228 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:06:44 compute-0 podman[145199]: 2025-10-08 19:06:44.704633089 +0000 UTC m=+0.114126976 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  8 19:06:45 compute-0 nova_compute[117514]: 2025-10-08 19:06:45.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:45 compute-0 nova_compute[117514]: 2025-10-08 19:06:45.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:49 compute-0 ovn_controller[19759]: 2025-10-08T19:06:49Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a4:ab:8d 10.100.0.25
Oct  8 19:06:49 compute-0 ovn_controller[19759]: 2025-10-08T19:06:49Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a4:ab:8d 10.100.0.25
Oct  8 19:06:49 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:49.450 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f81f7a-64d8-418a-a74c-b879bd6deb83, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:06:50 compute-0 nova_compute[117514]: 2025-10-08 19:06:50.136 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:50 compute-0 nova_compute[117514]: 2025-10-08 19:06:50.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:50 compute-0 podman[145228]: 2025-10-08 19:06:50.697978572 +0000 UTC m=+0.110521463 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, org.label-schema.schema-version=1.0)
Oct  8 19:06:50 compute-0 podman[145227]: 2025-10-08 19:06:50.698045984 +0000 UTC m=+0.116467774 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, version=9.6, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a 
stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_id=edpm)
Oct  8 19:06:54 compute-0 podman[145265]: 2025-10-08 19:06:54.67490662 +0000 UTC m=+0.086161813 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 19:06:55 compute-0 nova_compute[117514]: 2025-10-08 19:06:55.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:55 compute-0 nova_compute[117514]: 2025-10-08 19:06:55.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:56 compute-0 podman[145290]: 2025-10-08 19:06:56.671512838 +0000 UTC m=+0.084151756 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, managed_by=edpm_ansible, tcib_managed=true, config_id=iscsid)
Oct  8 19:06:56 compute-0 podman[145291]: 2025-10-08 19:06:56.6802778 +0000 UTC m=+0.080792720 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct  8 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.346 2 DEBUG oslo_concurrency.lockutils [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.347 2 DEBUG oslo_concurrency.lockutils [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.348 2 DEBUG oslo_concurrency.lockutils [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.348 2 DEBUG oslo_concurrency.lockutils [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.349 2 DEBUG oslo_concurrency.lockutils [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.351 2 INFO nova.compute.manager [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Terminating instance#033[00m
Oct  8 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.353 2 DEBUG nova.compute.manager [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 19:06:58 compute-0 kernel: tap41ab28a1-92 (unregistering): left promiscuous mode
Oct  8 19:06:58 compute-0 NetworkManager[1035]: <info>  [1759950418.3824] device (tap41ab28a1-92): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 19:06:58 compute-0 ovn_controller[19759]: 2025-10-08T19:06:58Z|00039|binding|INFO|Releasing lport 41ab28a1-9254-46a6-97fc-2220fe30eccd from this chassis (sb_readonly=0)
Oct  8 19:06:58 compute-0 ovn_controller[19759]: 2025-10-08T19:06:58Z|00040|binding|INFO|Setting lport 41ab28a1-9254-46a6-97fc-2220fe30eccd down in Southbound
Oct  8 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:58 compute-0 ovn_controller[19759]: 2025-10-08T19:06:58Z|00041|binding|INFO|Removing iface tap41ab28a1-92 ovn-installed in OVS
Oct  8 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:58.455 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a4:ab:8d 10.100.0.25'], port_security=['fa:16:3e:a4:ab:8d 10.100.0.25'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.25/28', 'neutron:device_id': '432f298f-78dd-4e9e-9ee4-279c2bc544c1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f19211d-1888-42c2-a8ff-1de7bc4f9219', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b1b038d5-57f5-4b2c-9de0-90d7e6862c10', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=80124168-1b37-4a7c-9765-130e2be44549, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=41ab28a1-9254-46a6-97fc-2220fe30eccd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:06:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:58.457 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 41ab28a1-9254-46a6-97fc-2220fe30eccd in datapath 3f19211d-1888-42c2-a8ff-1de7bc4f9219 unbound from our chassis#033[00m
Oct  8 19:06:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:58.458 28643 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3f19211d-1888-42c2-a8ff-1de7bc4f9219, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 19:06:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:58.459 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[39b73246-bd46-4862-b6a7-201443236285]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:06:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:58.459 28643 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219 namespace which is not needed anymore#033[00m
Oct  8 19:06:58 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Deactivated successfully.
Oct  8 19:06:58 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Consumed 12.536s CPU time.
Oct  8 19:06:58 compute-0 systemd-machined[77568]: Machine qemu-2-instance-00000002 terminated.
Oct  8 19:06:58 compute-0 podman[145328]: 2025-10-08 19:06:58.595295305 +0000 UTC m=+0.154964368 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct  8 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.628 2 INFO nova.virt.libvirt.driver [-] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Instance destroyed successfully.#033[00m
Oct  8 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.629 2 DEBUG nova.objects.instance [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'resources' on Instance uuid 432f298f-78dd-4e9e-9ee4-279c2bc544c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.648 2 DEBUG nova.virt.libvirt.vif [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:06:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-389250187',display_name='tempest-TestNetworkBasicOps-server-389250187',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-389250187',id=2,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAad1Pe21vvEy0SgHMu1VF6n4SO9ujKCPB0SgACJ8nvOuhW/VjCSPOSETWk3+gFjb/KHaSwvZLGtfcSFz4SkdC0dg68nGstzwyBghc627R2c2cxKu7YHJFmDoK+RJ/yIQg==',key_name='tempest-TestNetworkBasicOps-1028595893',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:06:37Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-e7tbigki',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:06:37Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=432f298f-78dd-4e9e-9ee4-279c2bc544c1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "address": "fa:16:3e:a4:ab:8d", "network": {"id": "3f19211d-1888-42c2-a8ff-1de7bc4f9219", "bridge": "br-int", "label": "tempest-network-smoke--1824442985", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41ab28a1-92", "ovs_interfaceid": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.649 2 DEBUG nova.network.os_vif_util [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "address": "fa:16:3e:a4:ab:8d", "network": {"id": "3f19211d-1888-42c2-a8ff-1de7bc4f9219", "bridge": "br-int", "label": "tempest-network-smoke--1824442985", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap41ab28a1-92", "ovs_interfaceid": "41ab28a1-9254-46a6-97fc-2220fe30eccd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.649 2 DEBUG nova.network.os_vif_util [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a4:ab:8d,bridge_name='br-int',has_traffic_filtering=True,id=41ab28a1-9254-46a6-97fc-2220fe30eccd,network=Network(3f19211d-1888-42c2-a8ff-1de7bc4f9219),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41ab28a1-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.650 2 DEBUG os_vif [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:ab:8d,bridge_name='br-int',has_traffic_filtering=True,id=41ab28a1-9254-46a6-97fc-2220fe30eccd,network=Network(3f19211d-1888-42c2-a8ff-1de7bc4f9219),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41ab28a1-92') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.653 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap41ab28a1-92, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:58 compute-0 neutron-haproxy-ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219[145184]: [NOTICE]   (145188) : haproxy version is 2.8.14-c23fe91
Oct  8 19:06:58 compute-0 neutron-haproxy-ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219[145184]: [NOTICE]   (145188) : path to executable is /usr/sbin/haproxy
Oct  8 19:06:58 compute-0 neutron-haproxy-ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219[145184]: [WARNING]  (145188) : Exiting Master process...
Oct  8 19:06:58 compute-0 neutron-haproxy-ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219[145184]: [WARNING]  (145188) : Exiting Master process...
Oct  8 19:06:58 compute-0 neutron-haproxy-ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219[145184]: [ALERT]    (145188) : Current worker (145190) exited with code 143 (Terminated)
Oct  8 19:06:58 compute-0 neutron-haproxy-ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219[145184]: [WARNING]  (145188) : All workers exited. Exiting... (0)
Oct  8 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.661 2 INFO os_vif [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:ab:8d,bridge_name='br-int',has_traffic_filtering=True,id=41ab28a1-9254-46a6-97fc-2220fe30eccd,network=Network(3f19211d-1888-42c2-a8ff-1de7bc4f9219),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap41ab28a1-92')#033[00m
Oct  8 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.662 2 INFO nova.virt.libvirt.driver [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Deleting instance files /var/lib/nova/instances/432f298f-78dd-4e9e-9ee4-279c2bc544c1_del#033[00m
Oct  8 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.663 2 INFO nova.virt.libvirt.driver [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Deletion of /var/lib/nova/instances/432f298f-78dd-4e9e-9ee4-279c2bc544c1_del complete#033[00m
Oct  8 19:06:58 compute-0 systemd[1]: libpod-2fa83a290b4ce32236b9a062240e91de3f4adbdabf9275d47c9271f48749cdd9.scope: Deactivated successfully.
Oct  8 19:06:58 compute-0 podman[145384]: 2025-10-08 19:06:58.66863911 +0000 UTC m=+0.056892664 container died 2fa83a290b4ce32236b9a062240e91de3f4adbdabf9275d47c9271f48749cdd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 19:06:58 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2fa83a290b4ce32236b9a062240e91de3f4adbdabf9275d47c9271f48749cdd9-userdata-shm.mount: Deactivated successfully.
Oct  8 19:06:58 compute-0 systemd[1]: var-lib-containers-storage-overlay-4f952153b9c075c928f3a4048d93d2e597dc86bce3878df25663d6f266130ce7-merged.mount: Deactivated successfully.
Oct  8 19:06:58 compute-0 podman[145384]: 2025-10-08 19:06:58.723165665 +0000 UTC m=+0.111419189 container cleanup 2fa83a290b4ce32236b9a062240e91de3f4adbdabf9275d47c9271f48749cdd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  8 19:06:58 compute-0 systemd[1]: libpod-conmon-2fa83a290b4ce32236b9a062240e91de3f4adbdabf9275d47c9271f48749cdd9.scope: Deactivated successfully.
Oct  8 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.741 2 DEBUG nova.virt.libvirt.host [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754#033[00m
Oct  8 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.742 2 INFO nova.virt.libvirt.host [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] UEFI support detected#033[00m
Oct  8 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.744 2 INFO nova.compute.manager [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.745 2 DEBUG oslo.service.loopingcall [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.745 2 DEBUG nova.compute.manager [-] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.745 2 DEBUG nova.network.neutron [-] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 19:06:58 compute-0 podman[145431]: 2025-10-08 19:06:58.782675883 +0000 UTC m=+0.038022783 container remove 2fa83a290b4ce32236b9a062240e91de3f4adbdabf9275d47c9271f48749cdd9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  8 19:06:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:58.787 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[6ba1b0a0-47ee-4474-812e-289f9269ad36]: (4, ('Wed Oct  8 07:06:58 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219 (2fa83a290b4ce32236b9a062240e91de3f4adbdabf9275d47c9271f48749cdd9)\n2fa83a290b4ce32236b9a062240e91de3f4adbdabf9275d47c9271f48749cdd9\nWed Oct  8 07:06:58 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219 (2fa83a290b4ce32236b9a062240e91de3f4adbdabf9275d47c9271f48749cdd9)\n2fa83a290b4ce32236b9a062240e91de3f4adbdabf9275d47c9271f48749cdd9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:06:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:58.789 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[d0361c33-fff9-4abc-b4b2-487efb4b6e3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:06:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:58.790 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3f19211d-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:58 compute-0 kernel: tap3f19211d-10: left promiscuous mode
Oct  8 19:06:58 compute-0 nova_compute[117514]: 2025-10-08 19:06:58.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:06:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:58.810 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[5ba43561-b88e-407c-89bb-5fd74bb0a814]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:06:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:58.841 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[a7364aa4-f0d8-4501-b04a-023d3c4abf4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:06:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:58.842 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[c6757f75-0dee-406c-96c4-9f805a4c184d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:06:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:58.858 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[01754535-b816-4d2a-9cab-aa716e37c858]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 106711, 'reachable_time': 15840, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 145446, 'error': None, 'target': 'ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:06:58 compute-0 systemd[1]: run-netns-ovnmeta\x2d3f19211d\x2d1888\x2d42c2\x2da8ff\x2d1de7bc4f9219.mount: Deactivated successfully.
Oct  8 19:06:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:58.867 28783 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3f19211d-1888-42c2-a8ff-1de7bc4f9219 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 19:06:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:06:58.868 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[67f5ffe8-159b-48ea-8e2c-783ab259a6e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:06:59 compute-0 nova_compute[117514]: 2025-10-08 19:06:59.247 2 DEBUG nova.compute.manager [req-02bd1e3e-7c75-4728-bd73-60dfc30a339b req-ea9de67c-e7c2-4e6f-8b06-031314e5d9da bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Received event network-vif-unplugged-41ab28a1-9254-46a6-97fc-2220fe30eccd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:06:59 compute-0 nova_compute[117514]: 2025-10-08 19:06:59.248 2 DEBUG oslo_concurrency.lockutils [req-02bd1e3e-7c75-4728-bd73-60dfc30a339b req-ea9de67c-e7c2-4e6f-8b06-031314e5d9da bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:06:59 compute-0 nova_compute[117514]: 2025-10-08 19:06:59.248 2 DEBUG oslo_concurrency.lockutils [req-02bd1e3e-7c75-4728-bd73-60dfc30a339b req-ea9de67c-e7c2-4e6f-8b06-031314e5d9da bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:06:59 compute-0 nova_compute[117514]: 2025-10-08 19:06:59.249 2 DEBUG oslo_concurrency.lockutils [req-02bd1e3e-7c75-4728-bd73-60dfc30a339b req-ea9de67c-e7c2-4e6f-8b06-031314e5d9da bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:06:59 compute-0 nova_compute[117514]: 2025-10-08 19:06:59.249 2 DEBUG nova.compute.manager [req-02bd1e3e-7c75-4728-bd73-60dfc30a339b req-ea9de67c-e7c2-4e6f-8b06-031314e5d9da bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] No waiting events found dispatching network-vif-unplugged-41ab28a1-9254-46a6-97fc-2220fe30eccd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:06:59 compute-0 nova_compute[117514]: 2025-10-08 19:06:59.250 2 DEBUG nova.compute.manager [req-02bd1e3e-7c75-4728-bd73-60dfc30a339b req-ea9de67c-e7c2-4e6f-8b06-031314e5d9da bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Received event network-vif-unplugged-41ab28a1-9254-46a6-97fc-2220fe30eccd for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 19:06:59 compute-0 nova_compute[117514]: 2025-10-08 19:06:59.497 2 DEBUG nova.network.neutron [-] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:06:59 compute-0 nova_compute[117514]: 2025-10-08 19:06:59.516 2 INFO nova.compute.manager [-] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Took 0.77 seconds to deallocate network for instance.#033[00m
Oct  8 19:06:59 compute-0 nova_compute[117514]: 2025-10-08 19:06:59.563 2 DEBUG oslo_concurrency.lockutils [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:06:59 compute-0 nova_compute[117514]: 2025-10-08 19:06:59.564 2 DEBUG oslo_concurrency.lockutils [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:06:59 compute-0 nova_compute[117514]: 2025-10-08 19:06:59.652 2 DEBUG nova.compute.provider_tree [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:06:59 compute-0 nova_compute[117514]: 2025-10-08 19:06:59.669 2 DEBUG nova.scheduler.client.report [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:06:59 compute-0 nova_compute[117514]: 2025-10-08 19:06:59.694 2 DEBUG oslo_concurrency.lockutils [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:06:59 compute-0 nova_compute[117514]: 2025-10-08 19:06:59.735 2 INFO nova.scheduler.client.report [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Deleted allocations for instance 432f298f-78dd-4e9e-9ee4-279c2bc544c1#033[00m
Oct  8 19:06:59 compute-0 nova_compute[117514]: 2025-10-08 19:06:59.835 2 DEBUG oslo_concurrency.lockutils [None req-978d42a9-cdf9-4edc-ba0a-669fa9cd9ff2 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.488s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:06:59 compute-0 nova_compute[117514]: 2025-10-08 19:06:59.923 2 DEBUG nova.compute.manager [req-1cc77f87-0150-409d-b44d-bce4b450fac0 req-035ce338-ced7-45e7-9a63-0ca5d5569dae bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Received event network-vif-deleted-41ab28a1-9254-46a6-97fc-2220fe30eccd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:07:00 compute-0 nova_compute[117514]: 2025-10-08 19:07:00.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:01 compute-0 nova_compute[117514]: 2025-10-08 19:07:01.322 2 DEBUG nova.compute.manager [req-57dea52f-ce0d-4477-bf92-4a92af692b04 req-da1b2d6e-4910-4d8c-b0fb-54585411dcd6 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Received event network-vif-plugged-41ab28a1-9254-46a6-97fc-2220fe30eccd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:07:01 compute-0 nova_compute[117514]: 2025-10-08 19:07:01.323 2 DEBUG oslo_concurrency.lockutils [req-57dea52f-ce0d-4477-bf92-4a92af692b04 req-da1b2d6e-4910-4d8c-b0fb-54585411dcd6 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:07:01 compute-0 nova_compute[117514]: 2025-10-08 19:07:01.323 2 DEBUG oslo_concurrency.lockutils [req-57dea52f-ce0d-4477-bf92-4a92af692b04 req-da1b2d6e-4910-4d8c-b0fb-54585411dcd6 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:07:01 compute-0 nova_compute[117514]: 2025-10-08 19:07:01.324 2 DEBUG oslo_concurrency.lockutils [req-57dea52f-ce0d-4477-bf92-4a92af692b04 req-da1b2d6e-4910-4d8c-b0fb-54585411dcd6 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "432f298f-78dd-4e9e-9ee4-279c2bc544c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:07:01 compute-0 nova_compute[117514]: 2025-10-08 19:07:01.324 2 DEBUG nova.compute.manager [req-57dea52f-ce0d-4477-bf92-4a92af692b04 req-da1b2d6e-4910-4d8c-b0fb-54585411dcd6 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] No waiting events found dispatching network-vif-plugged-41ab28a1-9254-46a6-97fc-2220fe30eccd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:07:01 compute-0 nova_compute[117514]: 2025-10-08 19:07:01.325 2 WARNING nova.compute.manager [req-57dea52f-ce0d-4477-bf92-4a92af692b04 req-da1b2d6e-4910-4d8c-b0fb-54585411dcd6 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Received unexpected event network-vif-plugged-41ab28a1-9254-46a6-97fc-2220fe30eccd for instance with vm_state deleted and task_state None.#033[00m
Oct  8 19:07:02 compute-0 ovn_controller[19759]: 2025-10-08T19:07:02Z|00042|binding|INFO|Releasing lport f9878aab-28ef-456a-a43a-7cacc2381b1f from this chassis (sb_readonly=0)
Oct  8 19:07:02 compute-0 nova_compute[117514]: 2025-10-08 19:07:02.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.373 2 DEBUG oslo_concurrency.lockutils [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "533c431a-8ae8-4310-81dc-29285b78f93c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.373 2 DEBUG oslo_concurrency.lockutils [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "533c431a-8ae8-4310-81dc-29285b78f93c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.374 2 DEBUG oslo_concurrency.lockutils [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "533c431a-8ae8-4310-81dc-29285b78f93c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.374 2 DEBUG oslo_concurrency.lockutils [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "533c431a-8ae8-4310-81dc-29285b78f93c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.375 2 DEBUG oslo_concurrency.lockutils [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "533c431a-8ae8-4310-81dc-29285b78f93c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.377 2 INFO nova.compute.manager [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Terminating instance#033[00m
Oct  8 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.379 2 DEBUG nova.compute.manager [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 19:07:03 compute-0 kernel: tap82f4743a-dc (unregistering): left promiscuous mode
Oct  8 19:07:03 compute-0 NetworkManager[1035]: <info>  [1759950423.4083] device (tap82f4743a-dc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:03 compute-0 ovn_controller[19759]: 2025-10-08T19:07:03Z|00043|binding|INFO|Releasing lport 82f4743a-dcdc-49f7-be61-94d565e29842 from this chassis (sb_readonly=0)
Oct  8 19:07:03 compute-0 ovn_controller[19759]: 2025-10-08T19:07:03Z|00044|binding|INFO|Setting lport 82f4743a-dcdc-49f7-be61-94d565e29842 down in Southbound
Oct  8 19:07:03 compute-0 ovn_controller[19759]: 2025-10-08T19:07:03Z|00045|binding|INFO|Removing iface tap82f4743a-dc ovn-installed in OVS
Oct  8 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:03.431 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:6b:6c 10.100.0.3'], port_security=['fa:16:3e:2e:6b:6c 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '533c431a-8ae8-4310-81dc-29285b78f93c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a913b285-6d0a-478e-aa24-18bb458d8f7a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd3706646-002b-4286-ab41-a86fd84e3356', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=30f96b84-f723-4541-a1ae-463e873ff4a9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=82f4743a-dcdc-49f7-be61-94d565e29842) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:07:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:03.434 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 82f4743a-dcdc-49f7-be61-94d565e29842 in datapath a913b285-6d0a-478e-aa24-18bb458d8f7a unbound from our chassis#033[00m
Oct  8 19:07:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:03.436 28643 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a913b285-6d0a-478e-aa24-18bb458d8f7a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 19:07:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:03.437 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[e530d9ac-2335-4bfe-bf2a-d1076203377e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:07:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:03.438 28643 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a namespace which is not needed anymore#033[00m
Oct  8 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.449 2 DEBUG nova.compute.manager [req-70979864-b68c-4cf6-a572-e745794d451f req-acb49dd7-77f0-4780-b68c-54f5ba7035f8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Received event network-changed-82f4743a-dcdc-49f7-be61-94d565e29842 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.450 2 DEBUG nova.compute.manager [req-70979864-b68c-4cf6-a572-e745794d451f req-acb49dd7-77f0-4780-b68c-54f5ba7035f8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Refreshing instance network info cache due to event network-changed-82f4743a-dcdc-49f7-be61-94d565e29842. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.451 2 DEBUG oslo_concurrency.lockutils [req-70979864-b68c-4cf6-a572-e745794d451f req-acb49dd7-77f0-4780-b68c-54f5ba7035f8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-533c431a-8ae8-4310-81dc-29285b78f93c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.452 2 DEBUG oslo_concurrency.lockutils [req-70979864-b68c-4cf6-a572-e745794d451f req-acb49dd7-77f0-4780-b68c-54f5ba7035f8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-533c431a-8ae8-4310-81dc-29285b78f93c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.452 2 DEBUG nova.network.neutron [req-70979864-b68c-4cf6-a572-e745794d451f req-acb49dd7-77f0-4780-b68c-54f5ba7035f8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Refreshing network info cache for port 82f4743a-dcdc-49f7-be61-94d565e29842 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:03 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Oct  8 19:07:03 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 15.272s CPU time.
Oct  8 19:07:03 compute-0 systemd-machined[77568]: Machine qemu-1-instance-00000001 terminated.
Oct  8 19:07:03 compute-0 neutron-haproxy-ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a[144818]: [NOTICE]   (144822) : haproxy version is 2.8.14-c23fe91
Oct  8 19:07:03 compute-0 neutron-haproxy-ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a[144818]: [NOTICE]   (144822) : path to executable is /usr/sbin/haproxy
Oct  8 19:07:03 compute-0 neutron-haproxy-ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a[144818]: [WARNING]  (144822) : Exiting Master process...
Oct  8 19:07:03 compute-0 neutron-haproxy-ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a[144818]: [WARNING]  (144822) : Exiting Master process...
Oct  8 19:07:03 compute-0 neutron-haproxy-ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a[144818]: [ALERT]    (144822) : Current worker (144824) exited with code 143 (Terminated)
Oct  8 19:07:03 compute-0 neutron-haproxy-ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a[144818]: [WARNING]  (144822) : All workers exited. Exiting... (0)
Oct  8 19:07:03 compute-0 systemd[1]: libpod-3efae6d2598078a444d1d0b5df7fb7ce2c474b330f618d9a2595c2a8e415d9fa.scope: Deactivated successfully.
Oct  8 19:07:03 compute-0 podman[145472]: 2025-10-08 19:07:03.641627421 +0000 UTC m=+0.072023828 container died 3efae6d2598078a444d1d0b5df7fb7ce2c474b330f618d9a2595c2a8e415d9fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS)
Oct  8 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.661 2 INFO nova.virt.libvirt.driver [-] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Instance destroyed successfully.#033[00m
Oct  8 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.661 2 DEBUG nova.objects.instance [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'resources' on Instance uuid 533c431a-8ae8-4310-81dc-29285b78f93c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:07:03 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3efae6d2598078a444d1d0b5df7fb7ce2c474b330f618d9a2595c2a8e415d9fa-userdata-shm.mount: Deactivated successfully.
Oct  8 19:07:03 compute-0 systemd[1]: var-lib-containers-storage-overlay-d18ad23441d9b789fc1d272bc7da5f2175a2c9c00ba472ac96b05443d421d00c-merged.mount: Deactivated successfully.
Oct  8 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.683 2 DEBUG nova.virt.libvirt.vif [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:05:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-447228763',display_name='tempest-TestNetworkBasicOps-server-447228763',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-447228763',id=1,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEkUPXM3K1FQRSOHUI4ceK1l6cbpFonPXFALKMkZcGgnSoRiUTQsb/Q287ApBX2G3xb2VwfVQAcm0rggAGmL4bEoFJTCQrQCAGh+fp9j7aUYBxWFzZf4Ok3jDCvBVuh0yA==',key_name='tempest-TestNetworkBasicOps-1885837558',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:05:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-2r2x09q7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:05:59Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=533c431a-8ae8-4310-81dc-29285b78f93c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "82f4743a-dcdc-49f7-be61-94d565e29842", "address": "fa:16:3e:2e:6b:6c", "network": {"id": "a913b285-6d0a-478e-aa24-18bb458d8f7a", "bridge": "br-int", "label": "tempest-network-smoke--994528417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82f4743a-dc", "ovs_interfaceid": "82f4743a-dcdc-49f7-be61-94d565e29842", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.684 2 DEBUG nova.network.os_vif_util [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "82f4743a-dcdc-49f7-be61-94d565e29842", "address": "fa:16:3e:2e:6b:6c", "network": {"id": "a913b285-6d0a-478e-aa24-18bb458d8f7a", "bridge": "br-int", "label": "tempest-network-smoke--994528417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82f4743a-dc", "ovs_interfaceid": "82f4743a-dcdc-49f7-be61-94d565e29842", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.685 2 DEBUG nova.network.os_vif_util [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2e:6b:6c,bridge_name='br-int',has_traffic_filtering=True,id=82f4743a-dcdc-49f7-be61-94d565e29842,network=Network(a913b285-6d0a-478e-aa24-18bb458d8f7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82f4743a-dc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.686 2 DEBUG os_vif [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2e:6b:6c,bridge_name='br-int',has_traffic_filtering=True,id=82f4743a-dcdc-49f7-be61-94d565e29842,network=Network(a913b285-6d0a-478e-aa24-18bb458d8f7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82f4743a-dc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.689 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap82f4743a-dc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.698 2 INFO os_vif [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2e:6b:6c,bridge_name='br-int',has_traffic_filtering=True,id=82f4743a-dcdc-49f7-be61-94d565e29842,network=Network(a913b285-6d0a-478e-aa24-18bb458d8f7a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82f4743a-dc')#033[00m
Oct  8 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.700 2 INFO nova.virt.libvirt.driver [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Deleting instance files /var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c_del#033[00m
Oct  8 19:07:03 compute-0 podman[145472]: 2025-10-08 19:07:03.701084267 +0000 UTC m=+0.131480624 container cleanup 3efae6d2598078a444d1d0b5df7fb7ce2c474b330f618d9a2595c2a8e415d9fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  8 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.701 2 INFO nova.virt.libvirt.driver [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Deletion of /var/lib/nova/instances/533c431a-8ae8-4310-81dc-29285b78f93c_del complete#033[00m
Oct  8 19:07:03 compute-0 systemd[1]: libpod-conmon-3efae6d2598078a444d1d0b5df7fb7ce2c474b330f618d9a2595c2a8e415d9fa.scope: Deactivated successfully.
Oct  8 19:07:03 compute-0 podman[145519]: 2025-10-08 19:07:03.777433078 +0000 UTC m=+0.050079938 container remove 3efae6d2598078a444d1d0b5df7fb7ce2c474b330f618d9a2595c2a8e415d9fa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 19:07:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:03.786 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[e0be4785-02d3-4ffe-8626-cb143600038c]: (4, ('Wed Oct  8 07:07:03 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a (3efae6d2598078a444d1d0b5df7fb7ce2c474b330f618d9a2595c2a8e415d9fa)\n3efae6d2598078a444d1d0b5df7fb7ce2c474b330f618d9a2595c2a8e415d9fa\nWed Oct  8 07:07:03 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a (3efae6d2598078a444d1d0b5df7fb7ce2c474b330f618d9a2595c2a8e415d9fa)\n3efae6d2598078a444d1d0b5df7fb7ce2c474b330f618d9a2595c2a8e415d9fa\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:07:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:03.788 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[e1db45f3-209c-4d26-8575-3344ddeb145c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:07:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:03.790 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa913b285-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:07:03 compute-0 kernel: tapa913b285-60: left promiscuous mode
Oct  8 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:03.799 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[46bf249b-3287-473b-8361-30aafa17f97e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:03.847 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[2aaf48f8-2f4d-40b5-9054-976348a76769]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:07:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:03.849 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[578be7e4-7d3a-4aac-82a6-22a4ab2c81f0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.850 2 INFO nova.compute.manager [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Took 0.47 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.851 2 DEBUG oslo.service.loopingcall [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.853 2 DEBUG nova.compute.manager [-] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 19:07:03 compute-0 nova_compute[117514]: 2025-10-08 19:07:03.853 2 DEBUG nova.network.neutron [-] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 19:07:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:03.872 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[36c6e66d-58de-4837-a539-6b1db93c5565]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 103214, 'reachable_time': 24386, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 145535, 'error': None, 'target': 'ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:07:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:03.874 28783 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a913b285-6d0a-478e-aa24-18bb458d8f7a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 19:07:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:03.874 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[d822941a-7283-46f1-b916-71bee9e53e34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:07:03 compute-0 systemd[1]: run-netns-ovnmeta\x2da913b285\x2d6d0a\x2d478e\x2daa24\x2d18bb458d8f7a.mount: Deactivated successfully.
Oct  8 19:07:04 compute-0 nova_compute[117514]: 2025-10-08 19:07:04.358 2 DEBUG nova.network.neutron [-] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:07:04 compute-0 nova_compute[117514]: 2025-10-08 19:07:04.380 2 INFO nova.compute.manager [-] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Took 0.53 seconds to deallocate network for instance.#033[00m
Oct  8 19:07:04 compute-0 nova_compute[117514]: 2025-10-08 19:07:04.439 2 DEBUG oslo_concurrency.lockutils [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:07:04 compute-0 nova_compute[117514]: 2025-10-08 19:07:04.440 2 DEBUG oslo_concurrency.lockutils [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:07:04 compute-0 nova_compute[117514]: 2025-10-08 19:07:04.513 2 DEBUG nova.compute.provider_tree [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:07:04 compute-0 nova_compute[117514]: 2025-10-08 19:07:04.529 2 DEBUG nova.scheduler.client.report [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:07:04 compute-0 nova_compute[117514]: 2025-10-08 19:07:04.548 2 DEBUG oslo_concurrency.lockutils [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:07:04 compute-0 nova_compute[117514]: 2025-10-08 19:07:04.575 2 INFO nova.scheduler.client.report [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Deleted allocations for instance 533c431a-8ae8-4310-81dc-29285b78f93c#033[00m
Oct  8 19:07:04 compute-0 nova_compute[117514]: 2025-10-08 19:07:04.651 2 DEBUG oslo_concurrency.lockutils [None req-41a5d4ad-2c21-407b-b153-89b5f2fa7d6a efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "533c431a-8ae8-4310-81dc-29285b78f93c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.277s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.060 2 DEBUG nova.network.neutron [req-70979864-b68c-4cf6-a572-e745794d451f req-acb49dd7-77f0-4780-b68c-54f5ba7035f8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Updated VIF entry in instance network info cache for port 82f4743a-dcdc-49f7-be61-94d565e29842. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.061 2 DEBUG nova.network.neutron [req-70979864-b68c-4cf6-a572-e745794d451f req-acb49dd7-77f0-4780-b68c-54f5ba7035f8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Updating instance_info_cache with network_info: [{"id": "82f4743a-dcdc-49f7-be61-94d565e29842", "address": "fa:16:3e:2e:6b:6c", "network": {"id": "a913b285-6d0a-478e-aa24-18bb458d8f7a", "bridge": "br-int", "label": "tempest-network-smoke--994528417", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82f4743a-dc", "ovs_interfaceid": "82f4743a-dcdc-49f7-be61-94d565e29842", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.079 2 DEBUG oslo_concurrency.lockutils [req-70979864-b68c-4cf6-a572-e745794d451f req-acb49dd7-77f0-4780-b68c-54f5ba7035f8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-533c431a-8ae8-4310-81dc-29285b78f93c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.574 2 DEBUG nova.compute.manager [req-637c35c1-aa0a-4b96-a5c3-ce3c5fb0defb req-a0487a94-3dd2-4483-bd01-9e7afc7906ee bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Received event network-vif-unplugged-82f4743a-dcdc-49f7-be61-94d565e29842 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.575 2 DEBUG oslo_concurrency.lockutils [req-637c35c1-aa0a-4b96-a5c3-ce3c5fb0defb req-a0487a94-3dd2-4483-bd01-9e7afc7906ee bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "533c431a-8ae8-4310-81dc-29285b78f93c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.575 2 DEBUG oslo_concurrency.lockutils [req-637c35c1-aa0a-4b96-a5c3-ce3c5fb0defb req-a0487a94-3dd2-4483-bd01-9e7afc7906ee bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "533c431a-8ae8-4310-81dc-29285b78f93c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.576 2 DEBUG oslo_concurrency.lockutils [req-637c35c1-aa0a-4b96-a5c3-ce3c5fb0defb req-a0487a94-3dd2-4483-bd01-9e7afc7906ee bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "533c431a-8ae8-4310-81dc-29285b78f93c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.576 2 DEBUG nova.compute.manager [req-637c35c1-aa0a-4b96-a5c3-ce3c5fb0defb req-a0487a94-3dd2-4483-bd01-9e7afc7906ee bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] No waiting events found dispatching network-vif-unplugged-82f4743a-dcdc-49f7-be61-94d565e29842 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.577 2 WARNING nova.compute.manager [req-637c35c1-aa0a-4b96-a5c3-ce3c5fb0defb req-a0487a94-3dd2-4483-bd01-9e7afc7906ee bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Received unexpected event network-vif-unplugged-82f4743a-dcdc-49f7-be61-94d565e29842 for instance with vm_state deleted and task_state None.#033[00m
Oct  8 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.577 2 DEBUG nova.compute.manager [req-637c35c1-aa0a-4b96-a5c3-ce3c5fb0defb req-a0487a94-3dd2-4483-bd01-9e7afc7906ee bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Received event network-vif-plugged-82f4743a-dcdc-49f7-be61-94d565e29842 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.578 2 DEBUG oslo_concurrency.lockutils [req-637c35c1-aa0a-4b96-a5c3-ce3c5fb0defb req-a0487a94-3dd2-4483-bd01-9e7afc7906ee bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "533c431a-8ae8-4310-81dc-29285b78f93c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.579 2 DEBUG oslo_concurrency.lockutils [req-637c35c1-aa0a-4b96-a5c3-ce3c5fb0defb req-a0487a94-3dd2-4483-bd01-9e7afc7906ee bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "533c431a-8ae8-4310-81dc-29285b78f93c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.579 2 DEBUG oslo_concurrency.lockutils [req-637c35c1-aa0a-4b96-a5c3-ce3c5fb0defb req-a0487a94-3dd2-4483-bd01-9e7afc7906ee bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "533c431a-8ae8-4310-81dc-29285b78f93c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.580 2 DEBUG nova.compute.manager [req-637c35c1-aa0a-4b96-a5c3-ce3c5fb0defb req-a0487a94-3dd2-4483-bd01-9e7afc7906ee bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] No waiting events found dispatching network-vif-plugged-82f4743a-dcdc-49f7-be61-94d565e29842 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.580 2 WARNING nova.compute.manager [req-637c35c1-aa0a-4b96-a5c3-ce3c5fb0defb req-a0487a94-3dd2-4483-bd01-9e7afc7906ee bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Received unexpected event network-vif-plugged-82f4743a-dcdc-49f7-be61-94d565e29842 for instance with vm_state deleted and task_state None.#033[00m
Oct  8 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.581 2 DEBUG nova.compute.manager [req-637c35c1-aa0a-4b96-a5c3-ce3c5fb0defb req-a0487a94-3dd2-4483-bd01-9e7afc7906ee bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Received event network-vif-deleted-82f4743a-dcdc-49f7-be61-94d565e29842 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.958 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.959 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.974 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.974 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.974 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.987 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.987 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.987 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.987 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.988 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.988 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.988 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 19:07:05 compute-0 nova_compute[117514]: 2025-10-08 19:07:05.989 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:07:06 compute-0 nova_compute[117514]: 2025-10-08 19:07:06.008 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:07:06 compute-0 nova_compute[117514]: 2025-10-08 19:07:06.008 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:07:06 compute-0 nova_compute[117514]: 2025-10-08 19:07:06.008 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:07:06 compute-0 nova_compute[117514]: 2025-10-08 19:07:06.009 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 19:07:06 compute-0 podman[145537]: 2025-10-08 19:07:06.149233853 +0000 UTC m=+0.083483947 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 19:07:06 compute-0 nova_compute[117514]: 2025-10-08 19:07:06.212 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 19:07:06 compute-0 nova_compute[117514]: 2025-10-08 19:07:06.213 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6103MB free_disk=73.42378616333008GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 19:07:06 compute-0 nova_compute[117514]: 2025-10-08 19:07:06.213 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:07:06 compute-0 nova_compute[117514]: 2025-10-08 19:07:06.213 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:07:06 compute-0 nova_compute[117514]: 2025-10-08 19:07:06.285 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 19:07:06 compute-0 nova_compute[117514]: 2025-10-08 19:07:06.285 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 19:07:06 compute-0 nova_compute[117514]: 2025-10-08 19:07:06.322 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:07:06 compute-0 nova_compute[117514]: 2025-10-08 19:07:06.335 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:07:06 compute-0 nova_compute[117514]: 2025-10-08 19:07:06.357 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 19:07:06 compute-0 nova_compute[117514]: 2025-10-08 19:07:06.358 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:07:07 compute-0 nova_compute[117514]: 2025-10-08 19:07:07.088 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:07:07 compute-0 nova_compute[117514]: 2025-10-08 19:07:07.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:08 compute-0 nova_compute[117514]: 2025-10-08 19:07:08.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:08 compute-0 nova_compute[117514]: 2025-10-08 19:07:08.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:10 compute-0 nova_compute[117514]: 2025-10-08 19:07:10.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:13 compute-0 nova_compute[117514]: 2025-10-08 19:07:13.626 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759950418.625193, 432f298f-78dd-4e9e-9ee4-279c2bc544c1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:07:13 compute-0 nova_compute[117514]: 2025-10-08 19:07:13.627 2 INFO nova.compute.manager [-] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] VM Stopped (Lifecycle Event)#033[00m
Oct  8 19:07:13 compute-0 nova_compute[117514]: 2025-10-08 19:07:13.652 2 DEBUG nova.compute.manager [None req-eebc01ce-a890-4c54-96c8-1687ed263f54 - - - - - -] [instance: 432f298f-78dd-4e9e-9ee4-279c2bc544c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:07:13 compute-0 nova_compute[117514]: 2025-10-08 19:07:13.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:15 compute-0 nova_compute[117514]: 2025-10-08 19:07:15.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:15 compute-0 podman[145562]: 2025-10-08 19:07:15.667541694 +0000 UTC m=+0.092552507 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm)
Oct  8 19:07:18 compute-0 nova_compute[117514]: 2025-10-08 19:07:18.657 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759950423.6562872, 533c431a-8ae8-4310-81dc-29285b78f93c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:07:18 compute-0 nova_compute[117514]: 2025-10-08 19:07:18.657 2 INFO nova.compute.manager [-] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] VM Stopped (Lifecycle Event)#033[00m
Oct  8 19:07:18 compute-0 nova_compute[117514]: 2025-10-08 19:07:18.676 2 DEBUG nova.compute.manager [None req-a69db70a-3252-46f5-84dd-d499a8296a62 - - - - - -] [instance: 533c431a-8ae8-4310-81dc-29285b78f93c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:07:18 compute-0 nova_compute[117514]: 2025-10-08 19:07:18.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:20 compute-0 nova_compute[117514]: 2025-10-08 19:07:20.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:21 compute-0 podman[145582]: 2025-10-08 19:07:21.65709319 +0000 UTC m=+0.072425360 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_id=edpm, name=ubi9-minimal, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, maintainer=Red Hat, Inc., distribution-scope=public, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  8 19:07:21 compute-0 podman[145583]: 2025-10-08 19:07:21.660958201 +0000 UTC m=+0.071259117 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.357 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.358 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.382 2 DEBUG nova.compute.manager [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.498 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.498 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.508 2 DEBUG nova.virt.hardware [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.508 2 INFO nova.compute.claims [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Claim successful on node compute-0.ctlplane.example.com#033[00m
Oct  8 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.629 2 DEBUG nova.compute.provider_tree [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.647 2 DEBUG nova.scheduler.client.report [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.672 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.673 2 DEBUG nova.compute.manager [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.713 2 DEBUG nova.compute.manager [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.714 2 DEBUG nova.network.neutron [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.731 2 INFO nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.750 2 DEBUG nova.compute.manager [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.829 2 DEBUG nova.compute.manager [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.832 2 DEBUG nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.833 2 INFO nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Creating image(s)#033[00m
Oct  8 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.834 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.834 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.836 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.857 2 DEBUG nova.policy [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.859 2 DEBUG oslo_concurrency.processutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.922 2 DEBUG oslo_concurrency.processutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.924 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "008eb3078b811ee47058b7252a820910c35fc6df" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.925 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:07:22 compute-0 nova_compute[117514]: 2025-10-08 19:07:22.951 2 DEBUG oslo_concurrency.processutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:07:23 compute-0 nova_compute[117514]: 2025-10-08 19:07:23.010 2 DEBUG oslo_concurrency.processutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:07:23 compute-0 nova_compute[117514]: 2025-10-08 19:07:23.012 2 DEBUG oslo_concurrency.processutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:07:23 compute-0 nova_compute[117514]: 2025-10-08 19:07:23.177 2 DEBUG oslo_concurrency.processutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk 1073741824" returned: 0 in 0.166s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:07:23 compute-0 nova_compute[117514]: 2025-10-08 19:07:23.179 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.254s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:07:23 compute-0 nova_compute[117514]: 2025-10-08 19:07:23.180 2 DEBUG oslo_concurrency.processutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:07:23 compute-0 nova_compute[117514]: 2025-10-08 19:07:23.252 2 DEBUG oslo_concurrency.processutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:07:23 compute-0 nova_compute[117514]: 2025-10-08 19:07:23.253 2 DEBUG nova.virt.disk.api [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Checking if we can resize image /var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  8 19:07:23 compute-0 nova_compute[117514]: 2025-10-08 19:07:23.254 2 DEBUG oslo_concurrency.processutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:07:23 compute-0 nova_compute[117514]: 2025-10-08 19:07:23.309 2 DEBUG oslo_concurrency.processutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:07:23 compute-0 nova_compute[117514]: 2025-10-08 19:07:23.310 2 DEBUG nova.virt.disk.api [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Cannot resize image /var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  8 19:07:23 compute-0 nova_compute[117514]: 2025-10-08 19:07:23.311 2 DEBUG nova.objects.instance [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'migration_context' on Instance uuid b66b330b-1cad-4dfb-a2f9-83201dc8ee32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:07:23 compute-0 nova_compute[117514]: 2025-10-08 19:07:23.325 2 DEBUG nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 19:07:23 compute-0 nova_compute[117514]: 2025-10-08 19:07:23.325 2 DEBUG nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Ensure instance console log exists: /var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 19:07:23 compute-0 nova_compute[117514]: 2025-10-08 19:07:23.326 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:07:23 compute-0 nova_compute[117514]: 2025-10-08 19:07:23.326 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:07:23 compute-0 nova_compute[117514]: 2025-10-08 19:07:23.327 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:07:23 compute-0 nova_compute[117514]: 2025-10-08 19:07:23.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:25 compute-0 nova_compute[117514]: 2025-10-08 19:07:25.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:25 compute-0 podman[145638]: 2025-10-08 19:07:25.652107867 +0000 UTC m=+0.074556170 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 19:07:26 compute-0 nova_compute[117514]: 2025-10-08 19:07:26.182 2 DEBUG nova.network.neutron [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Successfully created port: 0107be0e-1b4b-47dd-9422-a435ded0964c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 19:07:27 compute-0 podman[145664]: 2025-10-08 19:07:27.65660042 +0000 UTC m=+0.059593041 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_metadata_agent)
Oct  8 19:07:27 compute-0 podman[145663]: 2025-10-08 19:07:27.697953387 +0000 UTC m=+0.076627580 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=iscsid, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Oct  8 19:07:28 compute-0 nova_compute[117514]: 2025-10-08 19:07:28.021 2 DEBUG nova.network.neutron [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Successfully updated port: 0107be0e-1b4b-47dd-9422-a435ded0964c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 19:07:28 compute-0 nova_compute[117514]: 2025-10-08 19:07:28.039 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:07:28 compute-0 nova_compute[117514]: 2025-10-08 19:07:28.039 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquired lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:07:28 compute-0 nova_compute[117514]: 2025-10-08 19:07:28.039 2 DEBUG nova.network.neutron [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 19:07:28 compute-0 nova_compute[117514]: 2025-10-08 19:07:28.120 2 DEBUG nova.compute.manager [req-d576261a-f88c-4a06-b12e-1473a2dddcaa req-d8d75571-6be0-49a7-afb8-bcf1c167c4d7 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received event network-changed-0107be0e-1b4b-47dd-9422-a435ded0964c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:07:28 compute-0 nova_compute[117514]: 2025-10-08 19:07:28.121 2 DEBUG nova.compute.manager [req-d576261a-f88c-4a06-b12e-1473a2dddcaa req-d8d75571-6be0-49a7-afb8-bcf1c167c4d7 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Refreshing instance network info cache due to event network-changed-0107be0e-1b4b-47dd-9422-a435ded0964c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 19:07:28 compute-0 nova_compute[117514]: 2025-10-08 19:07:28.121 2 DEBUG oslo_concurrency.lockutils [req-d576261a-f88c-4a06-b12e-1473a2dddcaa req-d8d75571-6be0-49a7-afb8-bcf1c167c4d7 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:07:28 compute-0 nova_compute[117514]: 2025-10-08 19:07:28.196 2 DEBUG nova.network.neutron [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 19:07:28 compute-0 nova_compute[117514]: 2025-10-08 19:07:28.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.265 2 DEBUG nova.network.neutron [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Updating instance_info_cache with network_info: [{"id": "0107be0e-1b4b-47dd-9422-a435ded0964c", "address": "fa:16:3e:d7:63:9d", "network": {"id": "15690acb-54cf-4081-a718-c14a1c0af6a8", "bridge": "br-int", "label": "tempest-network-smoke--977169033", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0107be0e-1b", "ovs_interfaceid": "0107be0e-1b4b-47dd-9422-a435ded0964c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.290 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Releasing lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.291 2 DEBUG nova.compute.manager [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Instance network_info: |[{"id": "0107be0e-1b4b-47dd-9422-a435ded0964c", "address": "fa:16:3e:d7:63:9d", "network": {"id": "15690acb-54cf-4081-a718-c14a1c0af6a8", "bridge": "br-int", "label": "tempest-network-smoke--977169033", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0107be0e-1b", "ovs_interfaceid": "0107be0e-1b4b-47dd-9422-a435ded0964c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.291 2 DEBUG oslo_concurrency.lockutils [req-d576261a-f88c-4a06-b12e-1473a2dddcaa req-d8d75571-6be0-49a7-afb8-bcf1c167c4d7 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.291 2 DEBUG nova.network.neutron [req-d576261a-f88c-4a06-b12e-1473a2dddcaa req-d8d75571-6be0-49a7-afb8-bcf1c167c4d7 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Refreshing network info cache for port 0107be0e-1b4b-47dd-9422-a435ded0964c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.295 2 DEBUG nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Start _get_guest_xml network_info=[{"id": "0107be0e-1b4b-47dd-9422-a435ded0964c", "address": "fa:16:3e:d7:63:9d", "network": {"id": "15690acb-54cf-4081-a718-c14a1c0af6a8", "bridge": "br-int", "label": "tempest-network-smoke--977169033", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0107be0e-1b", "ovs_interfaceid": "0107be0e-1b4b-47dd-9422-a435ded0964c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'image_id': '23cfa426-7011-4566-992d-1c7af39f70dd'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.299 2 WARNING nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.304 2 DEBUG nova.virt.libvirt.host [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.305 2 DEBUG nova.virt.libvirt.host [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.311 2 DEBUG nova.virt.libvirt.host [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.312 2 DEBUG nova.virt.libvirt.host [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.312 2 DEBUG nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.312 2 DEBUG nova.virt.hardware [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T19:05:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e8a148fc-4419-4813-98ff-a17e2a95609e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.313 2 DEBUG nova.virt.hardware [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.313 2 DEBUG nova.virt.hardware [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.314 2 DEBUG nova.virt.hardware [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.314 2 DEBUG nova.virt.hardware [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.314 2 DEBUG nova.virt.hardware [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.315 2 DEBUG nova.virt.hardware [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.315 2 DEBUG nova.virt.hardware [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.315 2 DEBUG nova.virt.hardware [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.315 2 DEBUG nova.virt.hardware [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.316 2 DEBUG nova.virt.hardware [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.319 2 DEBUG nova.virt.libvirt.vif [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:07:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-602516393',display_name='tempest-TestNetworkBasicOps-server-602516393',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-602516393',id=3,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC1aTvBPkgf1VfNjvC4uuKKg+tISnXImijmvs2cGp+FtgeRrvxYh9lBxLRU9xSzH0Z6LaCabBaf6NwgK+eU8uEwumcvsX4qsd2EcbV6VjIknh+8LBbcMTdQeQSSFJx6qhQ==',key_name='tempest-TestNetworkBasicOps-1029193278',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-dwebwbaf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:07:22Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=b66b330b-1cad-4dfb-a2f9-83201dc8ee32,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0107be0e-1b4b-47dd-9422-a435ded0964c", "address": "fa:16:3e:d7:63:9d", "network": {"id": "15690acb-54cf-4081-a718-c14a1c0af6a8", "bridge": "br-int", "label": "tempest-network-smoke--977169033", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0107be0e-1b", "ovs_interfaceid": "0107be0e-1b4b-47dd-9422-a435ded0964c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.320 2 DEBUG nova.network.os_vif_util [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "0107be0e-1b4b-47dd-9422-a435ded0964c", "address": "fa:16:3e:d7:63:9d", "network": {"id": "15690acb-54cf-4081-a718-c14a1c0af6a8", "bridge": "br-int", "label": "tempest-network-smoke--977169033", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0107be0e-1b", "ovs_interfaceid": "0107be0e-1b4b-47dd-9422-a435ded0964c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.321 2 DEBUG nova.network.os_vif_util [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d7:63:9d,bridge_name='br-int',has_traffic_filtering=True,id=0107be0e-1b4b-47dd-9422-a435ded0964c,network=Network(15690acb-54cf-4081-a718-c14a1c0af6a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0107be0e-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.322 2 DEBUG nova.objects.instance [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'pci_devices' on Instance uuid b66b330b-1cad-4dfb-a2f9-83201dc8ee32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.335 2 DEBUG nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] End _get_guest_xml xml=<domain type="kvm">
Oct  8 19:07:29 compute-0 nova_compute[117514]:  <uuid>b66b330b-1cad-4dfb-a2f9-83201dc8ee32</uuid>
Oct  8 19:07:29 compute-0 nova_compute[117514]:  <name>instance-00000003</name>
Oct  8 19:07:29 compute-0 nova_compute[117514]:  <memory>131072</memory>
Oct  8 19:07:29 compute-0 nova_compute[117514]:  <vcpu>1</vcpu>
Oct  8 19:07:29 compute-0 nova_compute[117514]:  <metadata>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 19:07:29 compute-0 nova_compute[117514]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:      <nova:name>tempest-TestNetworkBasicOps-server-602516393</nova:name>
Oct  8 19:07:29 compute-0 nova_compute[117514]:      <nova:creationTime>2025-10-08 19:07:29</nova:creationTime>
Oct  8 19:07:29 compute-0 nova_compute[117514]:      <nova:flavor name="m1.nano">
Oct  8 19:07:29 compute-0 nova_compute[117514]:        <nova:memory>128</nova:memory>
Oct  8 19:07:29 compute-0 nova_compute[117514]:        <nova:disk>1</nova:disk>
Oct  8 19:07:29 compute-0 nova_compute[117514]:        <nova:swap>0</nova:swap>
Oct  8 19:07:29 compute-0 nova_compute[117514]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 19:07:29 compute-0 nova_compute[117514]:        <nova:vcpus>1</nova:vcpus>
Oct  8 19:07:29 compute-0 nova_compute[117514]:      </nova:flavor>
Oct  8 19:07:29 compute-0 nova_compute[117514]:      <nova:owner>
Oct  8 19:07:29 compute-0 nova_compute[117514]:        <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct  8 19:07:29 compute-0 nova_compute[117514]:        <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct  8 19:07:29 compute-0 nova_compute[117514]:      </nova:owner>
Oct  8 19:07:29 compute-0 nova_compute[117514]:      <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:      <nova:ports>
Oct  8 19:07:29 compute-0 nova_compute[117514]:        <nova:port uuid="0107be0e-1b4b-47dd-9422-a435ded0964c">
Oct  8 19:07:29 compute-0 nova_compute[117514]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:        </nova:port>
Oct  8 19:07:29 compute-0 nova_compute[117514]:      </nova:ports>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    </nova:instance>
Oct  8 19:07:29 compute-0 nova_compute[117514]:  </metadata>
Oct  8 19:07:29 compute-0 nova_compute[117514]:  <sysinfo type="smbios">
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <system>
Oct  8 19:07:29 compute-0 nova_compute[117514]:      <entry name="manufacturer">RDO</entry>
Oct  8 19:07:29 compute-0 nova_compute[117514]:      <entry name="product">OpenStack Compute</entry>
Oct  8 19:07:29 compute-0 nova_compute[117514]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 19:07:29 compute-0 nova_compute[117514]:      <entry name="serial">b66b330b-1cad-4dfb-a2f9-83201dc8ee32</entry>
Oct  8 19:07:29 compute-0 nova_compute[117514]:      <entry name="uuid">b66b330b-1cad-4dfb-a2f9-83201dc8ee32</entry>
Oct  8 19:07:29 compute-0 nova_compute[117514]:      <entry name="family">Virtual Machine</entry>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    </system>
Oct  8 19:07:29 compute-0 nova_compute[117514]:  </sysinfo>
Oct  8 19:07:29 compute-0 nova_compute[117514]:  <os>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <boot dev="hd"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <smbios mode="sysinfo"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:  </os>
Oct  8 19:07:29 compute-0 nova_compute[117514]:  <features>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <acpi/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <apic/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <vmcoreinfo/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:  </features>
Oct  8 19:07:29 compute-0 nova_compute[117514]:  <clock offset="utc">
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <timer name="hpet" present="no"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:  </clock>
Oct  8 19:07:29 compute-0 nova_compute[117514]:  <cpu mode="host-model" match="exact">
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:  </cpu>
Oct  8 19:07:29 compute-0 nova_compute[117514]:  <devices>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <disk type="file" device="disk">
Oct  8 19:07:29 compute-0 nova_compute[117514]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:      <source file="/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:      <target dev="vda" bus="virtio"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <disk type="file" device="cdrom">
Oct  8 19:07:29 compute-0 nova_compute[117514]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:      <source file="/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk.config"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:      <target dev="sda" bus="sata"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <interface type="ethernet">
Oct  8 19:07:29 compute-0 nova_compute[117514]:      <mac address="fa:16:3e:d7:63:9d"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:      <model type="virtio"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:      <mtu size="1442"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:      <target dev="tap0107be0e-1b"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    </interface>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <serial type="pty">
Oct  8 19:07:29 compute-0 nova_compute[117514]:      <log file="/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/console.log" append="off"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    </serial>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <video>
Oct  8 19:07:29 compute-0 nova_compute[117514]:      <model type="virtio"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    </video>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <input type="tablet" bus="usb"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <rng model="virtio">
Oct  8 19:07:29 compute-0 nova_compute[117514]:      <backend model="random">/dev/urandom</backend>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    </rng>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <controller type="usb" index="0"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    <memballoon model="virtio">
Oct  8 19:07:29 compute-0 nova_compute[117514]:      <stats period="10"/>
Oct  8 19:07:29 compute-0 nova_compute[117514]:    </memballoon>
Oct  8 19:07:29 compute-0 nova_compute[117514]:  </devices>
Oct  8 19:07:29 compute-0 nova_compute[117514]: </domain>
Oct  8 19:07:29 compute-0 nova_compute[117514]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.336 2 DEBUG nova.compute.manager [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Preparing to wait for external event network-vif-plugged-0107be0e-1b4b-47dd-9422-a435ded0964c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.337 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.337 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.337 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.338 2 DEBUG nova.virt.libvirt.vif [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:07:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-602516393',display_name='tempest-TestNetworkBasicOps-server-602516393',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-602516393',id=3,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC1aTvBPkgf1VfNjvC4uuKKg+tISnXImijmvs2cGp+FtgeRrvxYh9lBxLRU9xSzH0Z6LaCabBaf6NwgK+eU8uEwumcvsX4qsd2EcbV6VjIknh+8LBbcMTdQeQSSFJx6qhQ==',key_name='tempest-TestNetworkBasicOps-1029193278',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-dwebwbaf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:07:22Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=b66b330b-1cad-4dfb-a2f9-83201dc8ee32,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0107be0e-1b4b-47dd-9422-a435ded0964c", "address": "fa:16:3e:d7:63:9d", "network": {"id": "15690acb-54cf-4081-a718-c14a1c0af6a8", "bridge": "br-int", "label": "tempest-network-smoke--977169033", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0107be0e-1b", "ovs_interfaceid": "0107be0e-1b4b-47dd-9422-a435ded0964c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.339 2 DEBUG nova.network.os_vif_util [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "0107be0e-1b4b-47dd-9422-a435ded0964c", "address": "fa:16:3e:d7:63:9d", "network": {"id": "15690acb-54cf-4081-a718-c14a1c0af6a8", "bridge": "br-int", "label": "tempest-network-smoke--977169033", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0107be0e-1b", "ovs_interfaceid": "0107be0e-1b4b-47dd-9422-a435ded0964c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.339 2 DEBUG nova.network.os_vif_util [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d7:63:9d,bridge_name='br-int',has_traffic_filtering=True,id=0107be0e-1b4b-47dd-9422-a435ded0964c,network=Network(15690acb-54cf-4081-a718-c14a1c0af6a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0107be0e-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.340 2 DEBUG os_vif [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:63:9d,bridge_name='br-int',has_traffic_filtering=True,id=0107be0e-1b4b-47dd-9422-a435ded0964c,network=Network(15690acb-54cf-4081-a718-c14a1c0af6a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0107be0e-1b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.341 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.341 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.344 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0107be0e-1b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.344 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0107be0e-1b, col_values=(('external_ids', {'iface-id': '0107be0e-1b4b-47dd-9422-a435ded0964c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d7:63:9d', 'vm-uuid': 'b66b330b-1cad-4dfb-a2f9-83201dc8ee32'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:29 compute-0 NetworkManager[1035]: <info>  [1759950449.3480] manager: (tap0107be0e-1b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/33)
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.353 2 INFO os_vif [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:63:9d,bridge_name='br-int',has_traffic_filtering=True,id=0107be0e-1b4b-47dd-9422-a435ded0964c,network=Network(15690acb-54cf-4081-a718-c14a1c0af6a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0107be0e-1b')#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.485 2 DEBUG nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.486 2 DEBUG nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.487 2 DEBUG nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No VIF found with MAC fa:16:3e:d7:63:9d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 19:07:29 compute-0 nova_compute[117514]: 2025-10-08 19:07:29.488 2 INFO nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Using config drive#033[00m
Oct  8 19:07:29 compute-0 podman[145706]: 2025-10-08 19:07:29.704552358 +0000 UTC m=+0.119535571 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct  8 19:07:30 compute-0 nova_compute[117514]: 2025-10-08 19:07:30.167 2 INFO nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Creating config drive at /var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk.config#033[00m
Oct  8 19:07:30 compute-0 nova_compute[117514]: 2025-10-08 19:07:30.176 2 DEBUG oslo_concurrency.processutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa7w4x7v3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:07:30 compute-0 nova_compute[117514]: 2025-10-08 19:07:30.318 2 DEBUG oslo_concurrency.processutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa7w4x7v3" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:07:30 compute-0 kernel: tap0107be0e-1b: entered promiscuous mode
Oct  8 19:07:30 compute-0 ovn_controller[19759]: 2025-10-08T19:07:30Z|00046|binding|INFO|Claiming lport 0107be0e-1b4b-47dd-9422-a435ded0964c for this chassis.
Oct  8 19:07:30 compute-0 ovn_controller[19759]: 2025-10-08T19:07:30Z|00047|binding|INFO|0107be0e-1b4b-47dd-9422-a435ded0964c: Claiming fa:16:3e:d7:63:9d 10.100.0.6
Oct  8 19:07:30 compute-0 NetworkManager[1035]: <info>  [1759950450.3994] manager: (tap0107be0e-1b): new Tun device (/org/freedesktop/NetworkManager/Devices/34)
Oct  8 19:07:30 compute-0 nova_compute[117514]: 2025-10-08 19:07:30.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:30 compute-0 nova_compute[117514]: 2025-10-08 19:07:30.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.421 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d7:63:9d 10.100.0.6'], port_security=['fa:16:3e:d7:63:9d 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'b66b330b-1cad-4dfb-a2f9-83201dc8ee32', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-15690acb-54cf-4081-a718-c14a1c0af6a8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '2', 'neutron:security_group_ids': '18c7314c-d74a-4643-933f-4dc6b05c33cc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e9980b68-53e4-4dfd-a3d6-cbcaebcf011d, chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=0107be0e-1b4b-47dd-9422-a435ded0964c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.424 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 0107be0e-1b4b-47dd-9422-a435ded0964c in datapath 15690acb-54cf-4081-a718-c14a1c0af6a8 bound to our chassis#033[00m
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.426 28643 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 15690acb-54cf-4081-a718-c14a1c0af6a8#033[00m
Oct  8 19:07:30 compute-0 systemd-udevd[145750]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.445 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[524d61a5-11f7-4b43-91be-b9c720ff22cf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.446 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap15690acb-51 in ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.448 144726 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap15690acb-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.448 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[3c63d004-7be7-4ce9-a57c-b3408a58dcef]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.449 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[db5da360-dd79-4076-91af-ad769c2861f7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:07:30 compute-0 NetworkManager[1035]: <info>  [1759950450.4557] device (tap0107be0e-1b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 19:07:30 compute-0 NetworkManager[1035]: <info>  [1759950450.4571] device (tap0107be0e-1b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 19:07:30 compute-0 systemd-machined[77568]: New machine qemu-3-instance-00000003.
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.462 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[35d09d2a-17b6-46d5-8ede-bf0eff73ec88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:07:30 compute-0 nova_compute[117514]: 2025-10-08 19:07:30.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:30 compute-0 systemd[1]: Started Virtual Machine qemu-3-instance-00000003.
Oct  8 19:07:30 compute-0 ovn_controller[19759]: 2025-10-08T19:07:30Z|00048|binding|INFO|Setting lport 0107be0e-1b4b-47dd-9422-a435ded0964c ovn-installed in OVS
Oct  8 19:07:30 compute-0 ovn_controller[19759]: 2025-10-08T19:07:30Z|00049|binding|INFO|Setting lport 0107be0e-1b4b-47dd-9422-a435ded0964c up in Southbound
Oct  8 19:07:30 compute-0 nova_compute[117514]: 2025-10-08 19:07:30.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.503 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[826fb329-891f-437a-96a0-190932833abb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.538 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[fb4fa0e4-970c-4bd7-ab2e-704723b60344]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:07:30 compute-0 nova_compute[117514]: 2025-10-08 19:07:30.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.543 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[0b6056d3-ccb6-4c2a-8333-81644c7988f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:07:30 compute-0 NetworkManager[1035]: <info>  [1759950450.5452] manager: (tap15690acb-50): new Veth device (/org/freedesktop/NetworkManager/Devices/35)
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.577 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[5dddfa28-4455-4ba0-be34-78e46748ac82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.579 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[c87e521e-3655-46db-a688-bc39a846d476]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:07:30 compute-0 NetworkManager[1035]: <info>  [1759950450.6018] device (tap15690acb-50): carrier: link connected
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.607 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[02d0c148-3d15-46a2-9950-9f0ebbc6beb9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.629 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[e29dc5c4-50e9-4ac9-b64a-051e2b9aace8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap15690acb-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b8:13:95'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 112118, 'reachable_time': 42695, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 145783, 'error': None, 'target': 'ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.653 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[4637b344-7ac6-415c-b405-0ad259d1a24a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb8:1395'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 112118, 'tstamp': 112118}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 145784, 'error': None, 'target': 'ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.674 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[ae09ee8d-c651-43e1-84d9-cc23fcda1ef4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap15690acb-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b8:13:95'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 112118, 'reachable_time': 42695, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 145785, 'error': None, 'target': 'ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.720 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[04c0eec0-442a-4fa7-b75f-61cf47c677f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.799 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[86b18965-93ef-4d98-8ad2-76c55f0e45e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.800 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap15690acb-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.800 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.800 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap15690acb-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:07:30 compute-0 kernel: tap15690acb-50: entered promiscuous mode
Oct  8 19:07:30 compute-0 NetworkManager[1035]: <info>  [1759950450.8032] manager: (tap15690acb-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.805 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap15690acb-50, col_values=(('external_ids', {'iface-id': 'b2172a75-691e-43ff-a242-3b06a5bfd197'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:07:30 compute-0 ovn_controller[19759]: 2025-10-08T19:07:30Z|00050|binding|INFO|Releasing lport b2172a75-691e-43ff-a242-3b06a5bfd197 from this chassis (sb_readonly=0)
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.808 28643 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/15690acb-54cf-4081-a718-c14a1c0af6a8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/15690acb-54cf-4081-a718-c14a1c0af6a8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.809 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[cb8caff9-c90d-430f-8d89-06be28316676]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.809 28643 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]: global
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]:    log         /dev/log local0 debug
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]:    log-tag     haproxy-metadata-proxy-15690acb-54cf-4081-a718-c14a1c0af6a8
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]:    user        root
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]:    group       root
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]:    maxconn     1024
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]:    pidfile     /var/lib/neutron/external/pids/15690acb-54cf-4081-a718-c14a1c0af6a8.pid.haproxy
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]:    daemon
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]: 
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]: defaults
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]:    log global
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]:    mode http
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]:    option httplog
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]:    option dontlognull
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]:    option http-server-close
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]:    option forwardfor
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]:    retries                 3
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]:    timeout http-request    30s
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]:    timeout connect         30s
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]:    timeout client          32s
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]:    timeout server          32s
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]:    timeout http-keep-alive 30s
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]: 
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]: 
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]: listen listener
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]:    bind 169.254.169.254:80
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]:    http-request add-header X-OVN-Network-ID 15690acb-54cf-4081-a718-c14a1c0af6a8
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct  8 19:07:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:30.810 28643 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8', 'env', 'PROCESS_TAG=haproxy-15690acb-54cf-4081-a718-c14a1c0af6a8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/15690acb-54cf-4081-a718-c14a1c0af6a8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct  8 19:07:30 compute-0 nova_compute[117514]: 2025-10-08 19:07:30.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:07:31 compute-0 nova_compute[117514]: 2025-10-08 19:07:31.092 2 DEBUG nova.network.neutron [req-d576261a-f88c-4a06-b12e-1473a2dddcaa req-d8d75571-6be0-49a7-afb8-bcf1c167c4d7 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Updated VIF entry in instance network info cache for port 0107be0e-1b4b-47dd-9422-a435ded0964c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct  8 19:07:31 compute-0 nova_compute[117514]: 2025-10-08 19:07:31.093 2 DEBUG nova.network.neutron [req-d576261a-f88c-4a06-b12e-1473a2dddcaa req-d8d75571-6be0-49a7-afb8-bcf1c167c4d7 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Updating instance_info_cache with network_info: [{"id": "0107be0e-1b4b-47dd-9422-a435ded0964c", "address": "fa:16:3e:d7:63:9d", "network": {"id": "15690acb-54cf-4081-a718-c14a1c0af6a8", "bridge": "br-int", "label": "tempest-network-smoke--977169033", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0107be0e-1b", "ovs_interfaceid": "0107be0e-1b4b-47dd-9422-a435ded0964c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  8 19:07:31 compute-0 nova_compute[117514]: 2025-10-08 19:07:31.128 2 DEBUG oslo_concurrency.lockutils [req-d576261a-f88c-4a06-b12e-1473a2dddcaa req-d8d75571-6be0-49a7-afb8-bcf1c167c4d7 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  8 19:07:31 compute-0 nova_compute[117514]: 2025-10-08 19:07:31.172 2 DEBUG nova.compute.manager [req-b259494c-4afb-4e90-8111-d49581857088 req-d2a7e073-7015-48a1-bb79-ebbfec6443c4 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received event network-vif-plugged-0107be0e-1b4b-47dd-9422-a435ded0964c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  8 19:07:31 compute-0 nova_compute[117514]: 2025-10-08 19:07:31.173 2 DEBUG oslo_concurrency.lockutils [req-b259494c-4afb-4e90-8111-d49581857088 req-d2a7e073-7015-48a1-bb79-ebbfec6443c4 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 19:07:31 compute-0 nova_compute[117514]: 2025-10-08 19:07:31.173 2 DEBUG oslo_concurrency.lockutils [req-b259494c-4afb-4e90-8111-d49581857088 req-d2a7e073-7015-48a1-bb79-ebbfec6443c4 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 19:07:31 compute-0 nova_compute[117514]: 2025-10-08 19:07:31.173 2 DEBUG oslo_concurrency.lockutils [req-b259494c-4afb-4e90-8111-d49581857088 req-d2a7e073-7015-48a1-bb79-ebbfec6443c4 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 19:07:31 compute-0 nova_compute[117514]: 2025-10-08 19:07:31.173 2 DEBUG nova.compute.manager [req-b259494c-4afb-4e90-8111-d49581857088 req-d2a7e073-7015-48a1-bb79-ebbfec6443c4 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Processing event network-vif-plugged-0107be0e-1b4b-47dd-9422-a435ded0964c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct  8 19:07:31 compute-0 podman[145817]: 2025-10-08 19:07:31.150146162 +0000 UTC m=+0.020985601 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct  8 19:07:31 compute-0 podman[145817]: 2025-10-08 19:07:31.512272797 +0000 UTC m=+0.383112246 container create f1a0117811421542b95673dae027a361c998bc57c3bcbb56c41602c72d45d71c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 19:07:31 compute-0 systemd[1]: Started libpod-conmon-f1a0117811421542b95673dae027a361c998bc57c3bcbb56c41602c72d45d71c.scope.
Oct  8 19:07:31 compute-0 systemd[1]: Started libcrun container.
Oct  8 19:07:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3dc6caed12c224c239b697cd06381493a291f904c3b4b3172b2f62f362bdce12/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 19:07:31 compute-0 podman[145817]: 2025-10-08 19:07:31.822428882 +0000 UTC m=+0.693268401 container init f1a0117811421542b95673dae027a361c998bc57c3bcbb56c41602c72d45d71c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 19:07:31 compute-0 podman[145817]: 2025-10-08 19:07:31.831587197 +0000 UTC m=+0.702426656 container start f1a0117811421542b95673dae027a361c998bc57c3bcbb56c41602c72d45d71c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct  8 19:07:31 compute-0 neutron-haproxy-ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8[145840]: [NOTICE]   (145844) : New worker (145846) forked
Oct  8 19:07:31 compute-0 neutron-haproxy-ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8[145840]: [NOTICE]   (145844) : Loading success.
Oct  8 19:07:31 compute-0 nova_compute[117514]: 2025-10-08 19:07:31.994 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950451.9940875, b66b330b-1cad-4dfb-a2f9-83201dc8ee32 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  8 19:07:31 compute-0 nova_compute[117514]: 2025-10-08 19:07:31.995 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] VM Started (Lifecycle Event)
Oct  8 19:07:31 compute-0 nova_compute[117514]: 2025-10-08 19:07:31.998 2 DEBUG nova.compute.manager [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct  8 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.002 2 DEBUG nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct  8 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.006 2 INFO nova.virt.libvirt.driver [-] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Instance spawned successfully.
Oct  8 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.008 2 DEBUG nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct  8 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.016 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  8 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.021 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  8 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.038 2 DEBUG nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  8 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.038 2 DEBUG nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  8 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.039 2 DEBUG nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  8 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.039 2 DEBUG nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  8 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.040 2 DEBUG nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  8 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.040 2 DEBUG nova.virt.libvirt.driver [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  8 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.052 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  8 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.053 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950451.9941866, b66b330b-1cad-4dfb-a2f9-83201dc8ee32 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  8 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.053 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] VM Paused (Lifecycle Event)
Oct  8 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.086 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  8 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.091 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950452.001077, b66b330b-1cad-4dfb-a2f9-83201dc8ee32 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  8 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.092 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] VM Resumed (Lifecycle Event)
Oct  8 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.119 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  8 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.125 2 INFO nova.compute.manager [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Took 9.29 seconds to spawn the instance on the hypervisor.
Oct  8 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.125 2 DEBUG nova.compute.manager [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  8 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.126 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  8 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.157 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  8 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.186 2 INFO nova.compute.manager [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Took 9.73 seconds to build instance.
Oct  8 19:07:32 compute-0 nova_compute[117514]: 2025-10-08 19:07:32.198 2 DEBUG oslo_concurrency.lockutils [None req-fbe674a9-a1c8-4e5b-852c-572d233220ab efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.840s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 19:07:33 compute-0 nova_compute[117514]: 2025-10-08 19:07:33.257 2 DEBUG nova.compute.manager [req-fa73bb40-f60d-4f07-94d6-d7ddc561fff0 req-c01fdfb7-8386-4b3c-b240-dd999f182d41 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received event network-vif-plugged-0107be0e-1b4b-47dd-9422-a435ded0964c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  8 19:07:33 compute-0 nova_compute[117514]: 2025-10-08 19:07:33.257 2 DEBUG oslo_concurrency.lockutils [req-fa73bb40-f60d-4f07-94d6-d7ddc561fff0 req-c01fdfb7-8386-4b3c-b240-dd999f182d41 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 19:07:33 compute-0 nova_compute[117514]: 2025-10-08 19:07:33.258 2 DEBUG oslo_concurrency.lockutils [req-fa73bb40-f60d-4f07-94d6-d7ddc561fff0 req-c01fdfb7-8386-4b3c-b240-dd999f182d41 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 19:07:33 compute-0 nova_compute[117514]: 2025-10-08 19:07:33.259 2 DEBUG oslo_concurrency.lockutils [req-fa73bb40-f60d-4f07-94d6-d7ddc561fff0 req-c01fdfb7-8386-4b3c-b240-dd999f182d41 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 19:07:33 compute-0 nova_compute[117514]: 2025-10-08 19:07:33.259 2 DEBUG nova.compute.manager [req-fa73bb40-f60d-4f07-94d6-d7ddc561fff0 req-c01fdfb7-8386-4b3c-b240-dd999f182d41 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] No waiting events found dispatching network-vif-plugged-0107be0e-1b4b-47dd-9422-a435ded0964c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  8 19:07:33 compute-0 nova_compute[117514]: 2025-10-08 19:07:33.260 2 WARNING nova.compute.manager [req-fa73bb40-f60d-4f07-94d6-d7ddc561fff0 req-c01fdfb7-8386-4b3c-b240-dd999f182d41 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received unexpected event network-vif-plugged-0107be0e-1b4b-47dd-9422-a435ded0964c for instance with vm_state active and task_state None.
Oct  8 19:07:34 compute-0 nova_compute[117514]: 2025-10-08 19:07:34.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:07:35 compute-0 nova_compute[117514]: 2025-10-08 19:07:35.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:07:36 compute-0 ovn_controller[19759]: 2025-10-08T19:07:36Z|00051|binding|INFO|Releasing lport b2172a75-691e-43ff-a242-3b06a5bfd197 from this chassis (sb_readonly=0)
Oct  8 19:07:36 compute-0 NetworkManager[1035]: <info>  [1759950456.4866] manager: (patch-br-int-to-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Oct  8 19:07:36 compute-0 NetworkManager[1035]: <info>  [1759950456.4879] manager: (patch-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Oct  8 19:07:36 compute-0 nova_compute[117514]: 2025-10-08 19:07:36.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:07:36 compute-0 ovn_controller[19759]: 2025-10-08T19:07:36Z|00052|binding|INFO|Releasing lport b2172a75-691e-43ff-a242-3b06a5bfd197 from this chassis (sb_readonly=0)
Oct  8 19:07:36 compute-0 nova_compute[117514]: 2025-10-08 19:07:36.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:07:36 compute-0 nova_compute[117514]: 2025-10-08 19:07:36.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:07:36 compute-0 podman[145857]: 2025-10-08 19:07:36.678310169 +0000 UTC m=+0.082985353 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 19:07:36 compute-0 nova_compute[117514]: 2025-10-08 19:07:36.796 2 DEBUG nova.compute.manager [req-d9c34c21-d660-420c-8c12-b9bd70bc0a64 req-50659bf2-2d61-487d-a504-f786c54340bc bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received event network-changed-0107be0e-1b4b-47dd-9422-a435ded0964c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  8 19:07:36 compute-0 nova_compute[117514]: 2025-10-08 19:07:36.797 2 DEBUG nova.compute.manager [req-d9c34c21-d660-420c-8c12-b9bd70bc0a64 req-50659bf2-2d61-487d-a504-f786c54340bc bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Refreshing instance network info cache due to event network-changed-0107be0e-1b4b-47dd-9422-a435ded0964c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  8 19:07:36 compute-0 nova_compute[117514]: 2025-10-08 19:07:36.797 2 DEBUG oslo_concurrency.lockutils [req-d9c34c21-d660-420c-8c12-b9bd70bc0a64 req-50659bf2-2d61-487d-a504-f786c54340bc bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  8 19:07:36 compute-0 nova_compute[117514]: 2025-10-08 19:07:36.798 2 DEBUG oslo_concurrency.lockutils [req-d9c34c21-d660-420c-8c12-b9bd70bc0a64 req-50659bf2-2d61-487d-a504-f786c54340bc bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  8 19:07:36 compute-0 nova_compute[117514]: 2025-10-08 19:07:36.798 2 DEBUG nova.network.neutron [req-d9c34c21-d660-420c-8c12-b9bd70bc0a64 req-50659bf2-2d61-487d-a504-f786c54340bc bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Refreshing network info cache for port 0107be0e-1b4b-47dd-9422-a435ded0964c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  8 19:07:38 compute-0 nova_compute[117514]: 2025-10-08 19:07:38.257 2 DEBUG nova.network.neutron [req-d9c34c21-d660-420c-8c12-b9bd70bc0a64 req-50659bf2-2d61-487d-a504-f786c54340bc bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Updated VIF entry in instance network info cache for port 0107be0e-1b4b-47dd-9422-a435ded0964c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Oct  8 19:07:38 compute-0 nova_compute[117514]: 2025-10-08 19:07:38.258 2 DEBUG nova.network.neutron [req-d9c34c21-d660-420c-8c12-b9bd70bc0a64 req-50659bf2-2d61-487d-a504-f786c54340bc bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Updating instance_info_cache with network_info: [{"id": "0107be0e-1b4b-47dd-9422-a435ded0964c", "address": "fa:16:3e:d7:63:9d", "network": {"id": "15690acb-54cf-4081-a718-c14a1c0af6a8", "bridge": "br-int", "label": "tempest-network-smoke--977169033", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0107be0e-1b", "ovs_interfaceid": "0107be0e-1b4b-47dd-9422-a435ded0964c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  8 19:07:38 compute-0 nova_compute[117514]: 2025-10-08 19:07:38.297 2 DEBUG oslo_concurrency.lockutils [req-d9c34c21-d660-420c-8c12-b9bd70bc0a64 req-50659bf2-2d61-487d-a504-f786c54340bc bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  8 19:07:39 compute-0 nova_compute[117514]: 2025-10-08 19:07:39.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:07:40 compute-0 nova_compute[117514]: 2025-10-08 19:07:40.545 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:07:43 compute-0 ovn_controller[19759]: 2025-10-08T19:07:43Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d7:63:9d 10.100.0.6
Oct  8 19:07:43 compute-0 ovn_controller[19759]: 2025-10-08T19:07:43Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d7:63:9d 10.100.0.6
Oct  8 19:07:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:44.228 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 19:07:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:44.229 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 19:07:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:44.230 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 19:07:44 compute-0 nova_compute[117514]: 2025-10-08 19:07:44.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:07:45 compute-0 nova_compute[117514]: 2025-10-08 19:07:45.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:07:46 compute-0 podman[145899]: 2025-10-08 19:07:46.682038475 +0000 UTC m=+0.094762517 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 19:07:49 compute-0 nova_compute[117514]: 2025-10-08 19:07:49.117 2 INFO nova.compute.manager [None req-71e8e984-2524-4caf-8dc5-182df0d9ea11 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Get console output
Oct  8 19:07:49 compute-0 nova_compute[117514]: 2025-10-08 19:07:49.126 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Oct  8 19:07:49 compute-0 nova_compute[117514]: 2025-10-08 19:07:49.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:07:50 compute-0 nova_compute[117514]: 2025-10-08 19:07:50.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:07:52 compute-0 podman[145923]: 2025-10-08 19:07:52.676441624 +0000 UTC m=+0.092589922 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, managed_by=edpm_ansible, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, vcs-type=git, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  8 19:07:52 compute-0 podman[145924]: 2025-10-08 19:07:52.680727653 +0000 UTC m=+0.091333184 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 19:07:53 compute-0 nova_compute[117514]: 2025-10-08 19:07:53.292 2 DEBUG oslo_concurrency.lockutils [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "interface-b66b330b-1cad-4dfb-a2f9-83201dc8ee32-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 19:07:53 compute-0 nova_compute[117514]: 2025-10-08 19:07:53.292 2 DEBUG oslo_concurrency.lockutils [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "interface-b66b330b-1cad-4dfb-a2f9-83201dc8ee32-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 19:07:53 compute-0 nova_compute[117514]: 2025-10-08 19:07:53.292 2 DEBUG nova.objects.instance [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'flavor' on Instance uuid b66b330b-1cad-4dfb-a2f9-83201dc8ee32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  8 19:07:54 compute-0 nova_compute[117514]: 2025-10-08 19:07:54.196 2 DEBUG nova.objects.instance [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'pci_requests' on Instance uuid b66b330b-1cad-4dfb-a2f9-83201dc8ee32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  8 19:07:54 compute-0 nova_compute[117514]: 2025-10-08 19:07:54.213 2 DEBUG nova.network.neutron [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  8 19:07:54 compute-0 nova_compute[117514]: 2025-10-08 19:07:54.350 2 DEBUG nova.policy [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  8 19:07:54 compute-0 nova_compute[117514]: 2025-10-08 19:07:54.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:07:55 compute-0 nova_compute[117514]: 2025-10-08 19:07:55.373 2 DEBUG nova.network.neutron [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Successfully created port: 6943627d-6614-41cb-9460-f0454c6defb1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  8 19:07:55 compute-0 nova_compute[117514]: 2025-10-08 19:07:55.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:07:56 compute-0 podman[145966]: 2025-10-08 19:07:56.621422006 +0000 UTC m=+0.049655182 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 19:07:57 compute-0 nova_compute[117514]: 2025-10-08 19:07:57.098 2 DEBUG nova.network.neutron [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Successfully updated port: 6943627d-6614-41cb-9460-f0454c6defb1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct  8 19:07:57 compute-0 nova_compute[117514]: 2025-10-08 19:07:57.115 2 DEBUG oslo_concurrency.lockutils [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  8 19:07:57 compute-0 nova_compute[117514]: 2025-10-08 19:07:57.115 2 DEBUG oslo_concurrency.lockutils [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquired lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  8 19:07:57 compute-0 nova_compute[117514]: 2025-10-08 19:07:57.115 2 DEBUG nova.network.neutron [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  8 19:07:57 compute-0 nova_compute[117514]: 2025-10-08 19:07:57.224 2 DEBUG nova.compute.manager [req-1731de79-2251-41c4-8ced-d39a834649c9 req-79a98552-bcdc-46fd-b25f-317a3475f875 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received event network-changed-6943627d-6614-41cb-9460-f0454c6defb1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  8 19:07:57 compute-0 nova_compute[117514]: 2025-10-08 19:07:57.225 2 DEBUG nova.compute.manager [req-1731de79-2251-41c4-8ced-d39a834649c9 req-79a98552-bcdc-46fd-b25f-317a3475f875 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Refreshing instance network info cache due to event network-changed-6943627d-6614-41cb-9460-f0454c6defb1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  8 19:07:57 compute-0 nova_compute[117514]: 2025-10-08 19:07:57.225 2 DEBUG oslo_concurrency.lockutils [req-1731de79-2251-41c4-8ced-d39a834649c9 req-79a98552-bcdc-46fd-b25f-317a3475f875 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  8 19:07:58 compute-0 podman[145991]: 2025-10-08 19:07:58.672930666 +0000 UTC m=+0.076791338 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  8 19:07:58 compute-0 podman[145990]: 2025-10-08 19:07:58.698307798 +0000 UTC m=+0.108583652 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3)
Oct  8 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.530 2 DEBUG nova.network.neutron [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Updating instance_info_cache with network_info: [{"id": "0107be0e-1b4b-47dd-9422-a435ded0964c", "address": "fa:16:3e:d7:63:9d", "network": {"id": "15690acb-54cf-4081-a718-c14a1c0af6a8", "bridge": "br-int", "label": "tempest-network-smoke--977169033", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0107be0e-1b", "ovs_interfaceid": "0107be0e-1b4b-47dd-9422-a435ded0964c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6943627d-6614-41cb-9460-f0454c6defb1", "address": "fa:16:3e:bc:a5:e4", "network": {"id": "c73d9547-8a91-4802-82a8-1a3a035fe63c", "bridge": "br-int", "label": "tempest-network-smoke--833981410", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6943627d-66", "ovs_interfaceid": "6943627d-6614-41cb-9460-f0454c6defb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  8 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.550 2 DEBUG oslo_concurrency.lockutils [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Releasing lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  8 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.552 2 DEBUG oslo_concurrency.lockutils [req-1731de79-2251-41c4-8ced-d39a834649c9 req-79a98552-bcdc-46fd-b25f-317a3475f875 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  8 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.553 2 DEBUG nova.network.neutron [req-1731de79-2251-41c4-8ced-d39a834649c9 req-79a98552-bcdc-46fd-b25f-317a3475f875 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Refreshing network info cache for port 6943627d-6614-41cb-9460-f0454c6defb1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  8 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.557 2 DEBUG nova.virt.libvirt.vif [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:07:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-602516393',display_name='tempest-TestNetworkBasicOps-server-602516393',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-602516393',id=3,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC1aTvBPkgf1VfNjvC4uuKKg+tISnXImijmvs2cGp+FtgeRrvxYh9lBxLRU9xSzH0Z6LaCabBaf6NwgK+eU8uEwumcvsX4qsd2EcbV6VjIknh+8LBbcMTdQeQSSFJx6qhQ==',key_name='tempest-TestNetworkBasicOps-1029193278',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:07:32Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-dwebwbaf',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:07:32Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=b66b330b-1cad-4dfb-a2f9-83201dc8ee32,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6943627d-6614-41cb-9460-f0454c6defb1", "address": "fa:16:3e:bc:a5:e4", "network": {"id": "c73d9547-8a91-4802-82a8-1a3a035fe63c", "bridge": "br-int", "label": "tempest-network-smoke--833981410", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6943627d-66", "ovs_interfaceid": "6943627d-6614-41cb-9460-f0454c6defb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct  8 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.558 2 DEBUG nova.network.os_vif_util [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "6943627d-6614-41cb-9460-f0454c6defb1", "address": "fa:16:3e:bc:a5:e4", "network": {"id": "c73d9547-8a91-4802-82a8-1a3a035fe63c", "bridge": "br-int", "label": "tempest-network-smoke--833981410", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6943627d-66", "ovs_interfaceid": "6943627d-6614-41cb-9460-f0454c6defb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct  8 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.559 2 DEBUG nova.network.os_vif_util [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bc:a5:e4,bridge_name='br-int',has_traffic_filtering=True,id=6943627d-6614-41cb-9460-f0454c6defb1,network=Network(c73d9547-8a91-4802-82a8-1a3a035fe63c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6943627d-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct  8 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.560 2 DEBUG os_vif [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bc:a5:e4,bridge_name='br-int',has_traffic_filtering=True,id=6943627d-6614-41cb-9460-f0454c6defb1,network=Network(c73d9547-8a91-4802-82a8-1a3a035fe63c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6943627d-66') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct  8 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.563 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  8 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.563 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct  8 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.574 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6943627d-66, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.575 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6943627d-66, col_values=(('external_ids', {'iface-id': '6943627d-6614-41cb-9460-f0454c6defb1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bc:a5:e4', 'vm-uuid': 'b66b330b-1cad-4dfb-a2f9-83201dc8ee32'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:59 compute-0 NetworkManager[1035]: <info>  [1759950479.5777] manager: (tap6943627d-66): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Oct  8 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.587 2 INFO os_vif [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bc:a5:e4,bridge_name='br-int',has_traffic_filtering=True,id=6943627d-6614-41cb-9460-f0454c6defb1,network=Network(c73d9547-8a91-4802-82a8-1a3a035fe63c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6943627d-66')#033[00m
Oct  8 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.588 2 DEBUG nova.virt.libvirt.vif [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:07:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-602516393',display_name='tempest-TestNetworkBasicOps-server-602516393',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-602516393',id=3,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC1aTvBPkgf1VfNjvC4uuKKg+tISnXImijmvs2cGp+FtgeRrvxYh9lBxLRU9xSzH0Z6LaCabBaf6NwgK+eU8uEwumcvsX4qsd2EcbV6VjIknh+8LBbcMTdQeQSSFJx6qhQ==',key_name='tempest-TestNetworkBasicOps-1029193278',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:07:32Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-dwebwbaf',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:07:32Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=b66b330b-1cad-4dfb-a2f9-83201dc8ee32,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6943627d-6614-41cb-9460-f0454c6defb1", "address": "fa:16:3e:bc:a5:e4", "network": {"id": "c73d9547-8a91-4802-82a8-1a3a035fe63c", "bridge": "br-int", "label": "tempest-network-smoke--833981410", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6943627d-66", "ovs_interfaceid": "6943627d-6614-41cb-9460-f0454c6defb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.589 2 DEBUG nova.network.os_vif_util [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "6943627d-6614-41cb-9460-f0454c6defb1", "address": "fa:16:3e:bc:a5:e4", "network": {"id": "c73d9547-8a91-4802-82a8-1a3a035fe63c", "bridge": "br-int", "label": "tempest-network-smoke--833981410", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6943627d-66", "ovs_interfaceid": "6943627d-6614-41cb-9460-f0454c6defb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.589 2 DEBUG nova.network.os_vif_util [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bc:a5:e4,bridge_name='br-int',has_traffic_filtering=True,id=6943627d-6614-41cb-9460-f0454c6defb1,network=Network(c73d9547-8a91-4802-82a8-1a3a035fe63c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6943627d-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.592 2 DEBUG nova.virt.libvirt.guest [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] attach device xml: <interface type="ethernet">
Oct  8 19:07:59 compute-0 nova_compute[117514]:  <mac address="fa:16:3e:bc:a5:e4"/>
Oct  8 19:07:59 compute-0 nova_compute[117514]:  <model type="virtio"/>
Oct  8 19:07:59 compute-0 nova_compute[117514]:  <driver name="vhost" rx_queue_size="512"/>
Oct  8 19:07:59 compute-0 nova_compute[117514]:  <mtu size="1442"/>
Oct  8 19:07:59 compute-0 nova_compute[117514]:  <target dev="tap6943627d-66"/>
Oct  8 19:07:59 compute-0 nova_compute[117514]: </interface>
Oct  8 19:07:59 compute-0 nova_compute[117514]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Oct  8 19:07:59 compute-0 kernel: tap6943627d-66: entered promiscuous mode
Oct  8 19:07:59 compute-0 NetworkManager[1035]: <info>  [1759950479.6070] manager: (tap6943627d-66): new Tun device (/org/freedesktop/NetworkManager/Devices/40)
Oct  8 19:07:59 compute-0 ovn_controller[19759]: 2025-10-08T19:07:59Z|00053|binding|INFO|Claiming lport 6943627d-6614-41cb-9460-f0454c6defb1 for this chassis.
Oct  8 19:07:59 compute-0 ovn_controller[19759]: 2025-10-08T19:07:59Z|00054|binding|INFO|6943627d-6614-41cb-9460-f0454c6defb1: Claiming fa:16:3e:bc:a5:e4 10.100.0.29
Oct  8 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.618 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bc:a5:e4 10.100.0.29'], port_security=['fa:16:3e:bc:a5:e4 10.100.0.29'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.29/28', 'neutron:device_id': 'b66b330b-1cad-4dfb-a2f9-83201dc8ee32', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73d9547-8a91-4802-82a8-1a3a035fe63c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'be57f10c-6afc-483d-a1fa-fab953b8fe3e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4df9aed3-d2c0-400e-9a01-f8aebdd77f61, chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=6943627d-6614-41cb-9460-f0454c6defb1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.620 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 6943627d-6614-41cb-9460-f0454c6defb1 in datapath c73d9547-8a91-4802-82a8-1a3a035fe63c bound to our chassis#033[00m
Oct  8 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.621 28643 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c73d9547-8a91-4802-82a8-1a3a035fe63c#033[00m
Oct  8 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.638 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[c9d9ab4f-f9f8-46bf-9306-85f6bee831d2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.640 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc73d9547-81 in ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.644 144726 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc73d9547-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.645 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[42273a2d-a8ca-4b5b-b10f-8d6e91a9529c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.646 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[343c0c4c-1f14-4bc7-9fc7-ee22316609eb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:07:59 compute-0 systemd-udevd[146037]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.668 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[1675ad6f-cd4d-4ecb-91f9-19f21f32a2f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.677 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:59 compute-0 ovn_controller[19759]: 2025-10-08T19:07:59Z|00055|binding|INFO|Setting lport 6943627d-6614-41cb-9460-f0454c6defb1 ovn-installed in OVS
Oct  8 19:07:59 compute-0 NetworkManager[1035]: <info>  [1759950479.6833] device (tap6943627d-66): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 19:07:59 compute-0 ovn_controller[19759]: 2025-10-08T19:07:59Z|00056|binding|INFO|Setting lport 6943627d-6614-41cb-9460-f0454c6defb1 up in Southbound
Oct  8 19:07:59 compute-0 NetworkManager[1035]: <info>  [1759950479.6841] device (tap6943627d-66): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.701 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[ce478e36-ae47-4de9-92e7-24356054d7c9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.736 2 DEBUG nova.virt.libvirt.driver [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.736 2 DEBUG nova.virt.libvirt.driver [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.737 2 DEBUG nova.virt.libvirt.driver [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No VIF found with MAC fa:16:3e:d7:63:9d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.737 2 DEBUG nova.virt.libvirt.driver [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No VIF found with MAC fa:16:3e:bc:a5:e4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.740 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[aa9afb3c-8f76-4acc-9f67-d7f5e7db329f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:07:59 compute-0 systemd-udevd[146040]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.746 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[9db9c83b-45f9-4f7f-9fae-eedf56774ed9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:07:59 compute-0 NetworkManager[1035]: <info>  [1759950479.7477] manager: (tapc73d9547-80): new Veth device (/org/freedesktop/NetworkManager/Devices/41)
Oct  8 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.762 2 DEBUG nova.virt.libvirt.guest [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 19:07:59 compute-0 nova_compute[117514]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 19:07:59 compute-0 nova_compute[117514]:  <nova:name>tempest-TestNetworkBasicOps-server-602516393</nova:name>
Oct  8 19:07:59 compute-0 nova_compute[117514]:  <nova:creationTime>2025-10-08 19:07:59</nova:creationTime>
Oct  8 19:07:59 compute-0 nova_compute[117514]:  <nova:flavor name="m1.nano">
Oct  8 19:07:59 compute-0 nova_compute[117514]:    <nova:memory>128</nova:memory>
Oct  8 19:07:59 compute-0 nova_compute[117514]:    <nova:disk>1</nova:disk>
Oct  8 19:07:59 compute-0 nova_compute[117514]:    <nova:swap>0</nova:swap>
Oct  8 19:07:59 compute-0 nova_compute[117514]:    <nova:ephemeral>0</nova:ephemeral>
Oct  8 19:07:59 compute-0 nova_compute[117514]:    <nova:vcpus>1</nova:vcpus>
Oct  8 19:07:59 compute-0 nova_compute[117514]:  </nova:flavor>
Oct  8 19:07:59 compute-0 nova_compute[117514]:  <nova:owner>
Oct  8 19:07:59 compute-0 nova_compute[117514]:    <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct  8 19:07:59 compute-0 nova_compute[117514]:    <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct  8 19:07:59 compute-0 nova_compute[117514]:  </nova:owner>
Oct  8 19:07:59 compute-0 nova_compute[117514]:  <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct  8 19:07:59 compute-0 nova_compute[117514]:  <nova:ports>
Oct  8 19:07:59 compute-0 nova_compute[117514]:    <nova:port uuid="0107be0e-1b4b-47dd-9422-a435ded0964c">
Oct  8 19:07:59 compute-0 nova_compute[117514]:      <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  8 19:07:59 compute-0 nova_compute[117514]:    </nova:port>
Oct  8 19:07:59 compute-0 nova_compute[117514]:    <nova:port uuid="6943627d-6614-41cb-9460-f0454c6defb1">
Oct  8 19:07:59 compute-0 nova_compute[117514]:      <nova:ip type="fixed" address="10.100.0.29" ipVersion="4"/>
Oct  8 19:07:59 compute-0 nova_compute[117514]:    </nova:port>
Oct  8 19:07:59 compute-0 nova_compute[117514]:  </nova:ports>
Oct  8 19:07:59 compute-0 nova_compute[117514]: </nova:instance>
Oct  8 19:07:59 compute-0 nova_compute[117514]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Oct  8 19:07:59 compute-0 nova_compute[117514]: 2025-10-08 19:07:59.790 2 DEBUG oslo_concurrency.lockutils [None req-329deb7c-b603-4e8f-bfa2-987cc1a21b61 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "interface-b66b330b-1cad-4dfb-a2f9-83201dc8ee32-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 6.498s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.799 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[0b566470-a0d5-4810-b7d6-b301ee2109a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.802 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[e17852c6-1877-4c60-ab48-dfea238bfe31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:07:59 compute-0 NetworkManager[1035]: <info>  [1759950479.8395] device (tapc73d9547-80): carrier: link connected
Oct  8 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.850 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[e19ebe3e-7cdb-44c8-a41c-f373770b0b4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.875 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[5fb3443e-cdb4-4fe1-997d-1b09ddc09b71]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc73d9547-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3f:68:c4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 115041, 'reachable_time': 35171, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 146081, 'error': None, 'target': 'ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.897 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[1dce7aff-cd74-422e-8acd-407a1e60ec92]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3f:68c4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 115041, 'tstamp': 115041}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 146086, 'error': None, 'target': 'ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:07:59 compute-0 podman[146053]: 2025-10-08 19:07:59.909839332 +0000 UTC m=+0.117814149 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.923 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[693e2f04-f15a-415f-94be-615b5d8a0eac]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc73d9547-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3f:68:c4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 115041, 'reachable_time': 35171, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 146090, 'error': None, 'target': 'ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:07:59 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:07:59.956 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[69905e99-2329-4dbb-bfdf-ecf3aab571c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:00.036 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[dd31f513-c07e-461f-a0e9-faf00a637cbe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:00.038 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc73d9547-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:00.039 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:00.039 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc73d9547-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:08:00 compute-0 nova_compute[117514]: 2025-10-08 19:08:00.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:00 compute-0 kernel: tapc73d9547-80: entered promiscuous mode
Oct  8 19:08:00 compute-0 NetworkManager[1035]: <info>  [1759950480.0428] manager: (tapc73d9547-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:00.046 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc73d9547-80, col_values=(('external_ids', {'iface-id': 'c436eb15-2527-4c5e-bb8f-6f582c6a8cdd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:08:00 compute-0 nova_compute[117514]: 2025-10-08 19:08:00.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:00 compute-0 ovn_controller[19759]: 2025-10-08T19:08:00Z|00057|binding|INFO|Releasing lport c436eb15-2527-4c5e-bb8f-6f582c6a8cdd from this chassis (sb_readonly=0)
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:00.049 28643 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c73d9547-8a91-4802-82a8-1a3a035fe63c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c73d9547-8a91-4802-82a8-1a3a035fe63c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 19:08:00 compute-0 nova_compute[117514]: 2025-10-08 19:08:00.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:00.050 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[6288806b-d32c-4714-8c06-87d1b3199b2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:00.051 28643 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]: global
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]:    log         /dev/log local0 debug
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]:    log-tag     haproxy-metadata-proxy-c73d9547-8a91-4802-82a8-1a3a035fe63c
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]:    user        root
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]:    group       root
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]:    maxconn     1024
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]:    pidfile     /var/lib/neutron/external/pids/c73d9547-8a91-4802-82a8-1a3a035fe63c.pid.haproxy
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]:    daemon
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]: 
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]: defaults
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]:    log global
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]:    mode http
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]:    option httplog
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]:    option dontlognull
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]:    option http-server-close
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]:    option forwardfor
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]:    retries                 3
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]:    timeout http-request    30s
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]:    timeout connect         30s
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]:    timeout client          32s
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]:    timeout server          32s
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]:    timeout http-keep-alive 30s
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]: 
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]: 
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]: listen listener
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]:    bind 169.254.169.254:80
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]:    http-request add-header X-OVN-Network-ID c73d9547-8a91-4802-82a8-1a3a035fe63c
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:00.051 28643 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c', 'env', 'PROCESS_TAG=haproxy-c73d9547-8a91-4802-82a8-1a3a035fe63c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c73d9547-8a91-4802-82a8-1a3a035fe63c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 19:08:00 compute-0 nova_compute[117514]: 2025-10-08 19:08:00.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:00 compute-0 nova_compute[117514]: 2025-10-08 19:08:00.315 2 DEBUG nova.compute.manager [req-1bc7b36b-1905-4bcd-8f67-cfb96eb3762d req-e6765698-c1d3-48c7-8bfb-c9baa6b8da8e bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received event network-vif-plugged-6943627d-6614-41cb-9460-f0454c6defb1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:08:00 compute-0 nova_compute[117514]: 2025-10-08 19:08:00.316 2 DEBUG oslo_concurrency.lockutils [req-1bc7b36b-1905-4bcd-8f67-cfb96eb3762d req-e6765698-c1d3-48c7-8bfb-c9baa6b8da8e bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:08:00 compute-0 nova_compute[117514]: 2025-10-08 19:08:00.317 2 DEBUG oslo_concurrency.lockutils [req-1bc7b36b-1905-4bcd-8f67-cfb96eb3762d req-e6765698-c1d3-48c7-8bfb-c9baa6b8da8e bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:08:00 compute-0 nova_compute[117514]: 2025-10-08 19:08:00.317 2 DEBUG oslo_concurrency.lockutils [req-1bc7b36b-1905-4bcd-8f67-cfb96eb3762d req-e6765698-c1d3-48c7-8bfb-c9baa6b8da8e bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:08:00 compute-0 nova_compute[117514]: 2025-10-08 19:08:00.317 2 DEBUG nova.compute.manager [req-1bc7b36b-1905-4bcd-8f67-cfb96eb3762d req-e6765698-c1d3-48c7-8bfb-c9baa6b8da8e bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] No waiting events found dispatching network-vif-plugged-6943627d-6614-41cb-9460-f0454c6defb1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:08:00 compute-0 nova_compute[117514]: 2025-10-08 19:08:00.318 2 WARNING nova.compute.manager [req-1bc7b36b-1905-4bcd-8f67-cfb96eb3762d req-e6765698-c1d3-48c7-8bfb-c9baa6b8da8e bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received unexpected event network-vif-plugged-6943627d-6614-41cb-9460-f0454c6defb1 for instance with vm_state active and task_state None.#033[00m
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:00.420 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:75:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '5e:14:dd:63:55:2a'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:08:00 compute-0 nova_compute[117514]: 2025-10-08 19:08:00.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:00 compute-0 podman[146122]: 2025-10-08 19:08:00.489070587 +0000 UTC m=+0.098926122 container create e8c6faf61c8db69d560312908f590b9785d8039ff6bc8873fa27b59c76a83637 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 19:08:00 compute-0 podman[146122]: 2025-10-08 19:08:00.431406385 +0000 UTC m=+0.041261960 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct  8 19:08:00 compute-0 systemd[1]: Started libpod-conmon-e8c6faf61c8db69d560312908f590b9785d8039ff6bc8873fa27b59c76a83637.scope.
Oct  8 19:08:00 compute-0 nova_compute[117514]: 2025-10-08 19:08:00.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:00 compute-0 systemd[1]: Started libcrun container.
Oct  8 19:08:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/274f69a4d16ffcd141ca0c2aabf5d247e94ba26d928301563c5c3be5cc17c132/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 19:08:00 compute-0 podman[146122]: 2025-10-08 19:08:00.605915106 +0000 UTC m=+0.215770671 container init e8c6faf61c8db69d560312908f590b9785d8039ff6bc8873fa27b59c76a83637 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 19:08:00 compute-0 podman[146122]: 2025-10-08 19:08:00.634655079 +0000 UTC m=+0.244510614 container start e8c6faf61c8db69d560312908f590b9785d8039ff6bc8873fa27b59c76a83637 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  8 19:08:00 compute-0 neutron-haproxy-ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c[146137]: [NOTICE]   (146141) : New worker (146143) forked
Oct  8 19:08:00 compute-0 neutron-haproxy-ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c[146137]: [NOTICE]   (146141) : Loading success.
Oct  8 19:08:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:00.710 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.319 2 DEBUG nova.network.neutron [req-1731de79-2251-41c4-8ced-d39a834649c9 req-79a98552-bcdc-46fd-b25f-317a3475f875 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Updated VIF entry in instance network info cache for port 6943627d-6614-41cb-9460-f0454c6defb1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.320 2 DEBUG nova.network.neutron [req-1731de79-2251-41c4-8ced-d39a834649c9 req-79a98552-bcdc-46fd-b25f-317a3475f875 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Updating instance_info_cache with network_info: [{"id": "0107be0e-1b4b-47dd-9422-a435ded0964c", "address": "fa:16:3e:d7:63:9d", "network": {"id": "15690acb-54cf-4081-a718-c14a1c0af6a8", "bridge": "br-int", "label": "tempest-network-smoke--977169033", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0107be0e-1b", "ovs_interfaceid": "0107be0e-1b4b-47dd-9422-a435ded0964c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "6943627d-6614-41cb-9460-f0454c6defb1", "address": "fa:16:3e:bc:a5:e4", "network": {"id": "c73d9547-8a91-4802-82a8-1a3a035fe63c", "bridge": "br-int", "label": "tempest-network-smoke--833981410", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, 
"tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6943627d-66", "ovs_interfaceid": "6943627d-6614-41cb-9460-f0454c6defb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.337 2 DEBUG oslo_concurrency.lockutils [req-1731de79-2251-41c4-8ced-d39a834649c9 req-79a98552-bcdc-46fd-b25f-317a3475f875 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:08:01 compute-0 ovn_controller[19759]: 2025-10-08T19:08:01Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:bc:a5:e4 10.100.0.29
Oct  8 19:08:01 compute-0 ovn_controller[19759]: 2025-10-08T19:08:01Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:bc:a5:e4 10.100.0.29
Oct  8 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.452 2 DEBUG oslo_concurrency.lockutils [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "interface-b66b330b-1cad-4dfb-a2f9-83201dc8ee32-6943627d-6614-41cb-9460-f0454c6defb1" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.453 2 DEBUG oslo_concurrency.lockutils [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "interface-b66b330b-1cad-4dfb-a2f9-83201dc8ee32-6943627d-6614-41cb-9460-f0454c6defb1" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.469 2 DEBUG nova.objects.instance [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'flavor' on Instance uuid b66b330b-1cad-4dfb-a2f9-83201dc8ee32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.491 2 DEBUG nova.virt.libvirt.vif [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:07:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-602516393',display_name='tempest-TestNetworkBasicOps-server-602516393',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-602516393',id=3,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC1aTvBPkgf1VfNjvC4uuKKg+tISnXImijmvs2cGp+FtgeRrvxYh9lBxLRU9xSzH0Z6LaCabBaf6NwgK+eU8uEwumcvsX4qsd2EcbV6VjIknh+8LBbcMTdQeQSSFJx6qhQ==',key_name='tempest-TestNetworkBasicOps-1029193278',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:07:32Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-dwebwbaf',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:07:32Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=b66b330b-1cad-4dfb-a2f9-83201dc8ee32,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6943627d-6614-41cb-9460-f0454c6defb1", "address": "fa:16:3e:bc:a5:e4", "network": {"id": "c73d9547-8a91-4802-82a8-1a3a035fe63c", "bridge": "br-int", "label": "tempest-network-smoke--833981410", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6943627d-66", "ovs_interfaceid": "6943627d-6614-41cb-9460-f0454c6defb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.492 2 DEBUG nova.network.os_vif_util [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "6943627d-6614-41cb-9460-f0454c6defb1", "address": "fa:16:3e:bc:a5:e4", "network": {"id": "c73d9547-8a91-4802-82a8-1a3a035fe63c", "bridge": "br-int", "label": "tempest-network-smoke--833981410", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6943627d-66", "ovs_interfaceid": "6943627d-6614-41cb-9460-f0454c6defb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.493 2 DEBUG nova.network.os_vif_util [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bc:a5:e4,bridge_name='br-int',has_traffic_filtering=True,id=6943627d-6614-41cb-9460-f0454c6defb1,network=Network(c73d9547-8a91-4802-82a8-1a3a035fe63c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6943627d-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.498 2 DEBUG nova.virt.libvirt.guest [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:bc:a5:e4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6943627d-66"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  8 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.501 2 DEBUG nova.virt.libvirt.guest [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:bc:a5:e4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6943627d-66"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  8 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.505 2 DEBUG nova.virt.libvirt.driver [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Attempting to detach device tap6943627d-66 from instance b66b330b-1cad-4dfb-a2f9-83201dc8ee32 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Oct  8 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.506 2 DEBUG nova.virt.libvirt.guest [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] detach device xml: <interface type="ethernet">
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <mac address="fa:16:3e:bc:a5:e4"/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <model type="virtio"/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <driver name="vhost" rx_queue_size="512"/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <mtu size="1442"/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <target dev="tap6943627d-66"/>
Oct  8 19:08:01 compute-0 nova_compute[117514]: </interface>
Oct  8 19:08:01 compute-0 nova_compute[117514]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Oct  8 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.513 2 DEBUG nova.virt.libvirt.guest [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:bc:a5:e4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6943627d-66"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  8 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.518 2 DEBUG nova.virt.libvirt.guest [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:bc:a5:e4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6943627d-66"/></interface>not found in domain: <domain type='kvm' id='3'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <name>instance-00000003</name>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <uuid>b66b330b-1cad-4dfb-a2f9-83201dc8ee32</uuid>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <metadata>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <nova:name>tempest-TestNetworkBasicOps-server-602516393</nova:name>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <nova:creationTime>2025-10-08 19:07:59</nova:creationTime>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <nova:flavor name="m1.nano">
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <nova:memory>128</nova:memory>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <nova:disk>1</nova:disk>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <nova:swap>0</nova:swap>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <nova:ephemeral>0</nova:ephemeral>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <nova:vcpus>1</nova:vcpus>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  </nova:flavor>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <nova:owner>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  </nova:owner>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <nova:ports>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <nova:port uuid="0107be0e-1b4b-47dd-9422-a435ded0964c">
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </nova:port>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <nova:port uuid="6943627d-6614-41cb-9460-f0454c6defb1">
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <nova:ip type="fixed" address="10.100.0.29" ipVersion="4"/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </nova:port>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  </nova:ports>
Oct  8 19:08:01 compute-0 nova_compute[117514]: </nova:instance>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  </metadata>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <memory unit='KiB'>131072</memory>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <currentMemory unit='KiB'>131072</currentMemory>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <vcpu placement='static'>1</vcpu>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <resource>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <partition>/machine</partition>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  </resource>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <sysinfo type='smbios'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <system>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <entry name='manufacturer'>RDO</entry>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <entry name='product'>OpenStack Compute</entry>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <entry name='serial'>b66b330b-1cad-4dfb-a2f9-83201dc8ee32</entry>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <entry name='uuid'>b66b330b-1cad-4dfb-a2f9-83201dc8ee32</entry>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <entry name='family'>Virtual Machine</entry>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </system>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  </sysinfo>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <os>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <boot dev='hd'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <smbios mode='sysinfo'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  </os>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <features>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <acpi/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <apic/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <vmcoreinfo state='on'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  </features>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <cpu mode='custom' match='exact' check='full'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <model fallback='forbid'>EPYC-Rome</model>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <vendor>AMD</vendor>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='x2apic'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='tsc-deadline'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='hypervisor'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='tsc_adjust'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='spec-ctrl'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='stibp'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='arch-capabilities'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='ssbd'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='cmp_legacy'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='overflow-recov'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='succor'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='ibrs'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='amd-ssbd'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='virt-ssbd'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='disable' name='lbrv'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='disable' name='tsc-scale'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='disable' name='vmcb-clean'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='disable' name='flushbyasid'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='disable' name='pause-filter'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='disable' name='pfthreshold'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='disable' name='svme-addr-chk'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='lfence-always-serializing'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='rdctl-no'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='mds-no'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='pschange-mc-no'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='gds-no'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='rfds-no'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='disable' name='xsaves'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='disable' name='svm'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='topoext'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='disable' name='npt'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='disable' name='nrip-save'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  </cpu>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <clock offset='utc'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <timer name='pit' tickpolicy='delay'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <timer name='rtc' tickpolicy='catchup'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <timer name='hpet' present='no'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  </clock>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <on_poweroff>destroy</on_poweroff>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <on_reboot>restart</on_reboot>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <on_crash>destroy</on_crash>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <devices>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <disk type='file' device='disk'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <driver name='qemu' type='qcow2' cache='none'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <source file='/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk' index='2'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <backingStore type='file' index='3'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:        <format type='raw'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:        <source file='/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:        <backingStore/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      </backingStore>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target dev='vda' bus='virtio'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='virtio-disk0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <disk type='file' device='cdrom'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <driver name='qemu' type='raw' cache='none'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <source file='/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk.config' index='1'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <backingStore/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target dev='sda' bus='sata'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <readonly/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='sata0-0-0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='0' model='pcie-root'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pcie.0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='1' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='1' port='0x10'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.1'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='2' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='2' port='0x11'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.2'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='3' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='3' port='0x12'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.3'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='4' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='4' port='0x13'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.4'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='5' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='5' port='0x14'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.5'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='6' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='6' port='0x15'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.6'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='7' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='7' port='0x16'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.7'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='8' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='8' port='0x17'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.8'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='9' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='9' port='0x18'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.9'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='10' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='10' port='0x19'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.10'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='11' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='11' port='0x1a'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.11'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='12' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='12' port='0x1b'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.12'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='13' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='13' port='0x1c'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.13'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='14' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='14' port='0x1d'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.14'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='15' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='15' port='0x1e'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.15'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='16' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='16' port='0x1f'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.16'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='17' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='17' port='0x20'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.17'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='18' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='18' port='0x21'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.18'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='19' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='19' port='0x22'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.19'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='20' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='20' port='0x23'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.20'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='21' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='21' port='0x24'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.21'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='22' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='22' port='0x25'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.22'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='23' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='23' port='0x26'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.23'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='24' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='24' port='0x27'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.24'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='25' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='25' port='0x28'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.25'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-pci-bridge'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.26'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='usb' index='0' model='piix3-uhci'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='usb'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='sata' index='0'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='ide'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <interface type='ethernet'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <mac address='fa:16:3e:d7:63:9d'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target dev='tap0107be0e-1b'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model type='virtio'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <driver name='vhost' rx_queue_size='512'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <mtu size='1442'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='net0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </interface>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <interface type='ethernet'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <mac address='fa:16:3e:bc:a5:e4'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target dev='tap6943627d-66'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model type='virtio'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <driver name='vhost' rx_queue_size='512'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <mtu size='1442'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='net1'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </interface>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <serial type='pty'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <source path='/dev/pts/0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <log file='/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/console.log' append='off'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target type='isa-serial' port='0'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:        <model name='isa-serial'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      </target>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='serial0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </serial>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <console type='pty' tty='/dev/pts/0'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <source path='/dev/pts/0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <log file='/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/console.log' append='off'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target type='serial' port='0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='serial0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </console>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <input type='tablet' bus='usb'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='input0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='usb' bus='0' port='1'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </input>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <input type='mouse' bus='ps2'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='input1'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </input>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <input type='keyboard' bus='ps2'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='input2'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </input>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <listen type='address' address='::0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </graphics>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <audio id='1' type='none'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <video>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model type='virtio' heads='1' primary='yes'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='video0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </video>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <watchdog model='itco' action='reset'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='watchdog0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </watchdog>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <memballoon model='virtio'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <stats period='10'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='balloon0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </memballoon>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <rng model='virtio'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <backend model='random'>/dev/urandom</backend>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='rng0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </rng>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  </devices>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <label>system_u:system_r:svirt_t:s0:c277,c815</label>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c277,c815</imagelabel>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  </seclabel>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <label>+107:+107</label>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <imagelabel>+107:+107</imagelabel>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  </seclabel>
Oct  8 19:08:01 compute-0 nova_compute[117514]: </domain>
Oct  8 19:08:01 compute-0 nova_compute[117514]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Oct  8 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.519 2 INFO nova.virt.libvirt.driver [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully detached device tap6943627d-66 from instance b66b330b-1cad-4dfb-a2f9-83201dc8ee32 from the persistent domain config.
Oct  8 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.520 2 DEBUG nova.virt.libvirt.driver [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] (1/8): Attempting to detach device tap6943627d-66 with device alias net1 from instance b66b330b-1cad-4dfb-a2f9-83201dc8ee32 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Oct  8 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.520 2 DEBUG nova.virt.libvirt.guest [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] detach device xml: <interface type="ethernet">
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <mac address="fa:16:3e:bc:a5:e4"/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <model type="virtio"/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <driver name="vhost" rx_queue_size="512"/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <mtu size="1442"/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <target dev="tap6943627d-66"/>
Oct  8 19:08:01 compute-0 nova_compute[117514]: </interface>
Oct  8 19:08:01 compute-0 nova_compute[117514]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Oct  8 19:08:01 compute-0 kernel: tap6943627d-66 (unregistering): left promiscuous mode
Oct  8 19:08:01 compute-0 NetworkManager[1035]: <info>  [1759950481.5881] device (tap6943627d-66): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.596 2 DEBUG nova.virt.libvirt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Received event <DeviceRemovedEvent: 1759950481.5965085, b66b330b-1cad-4dfb-a2f9-83201dc8ee32 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Oct  8 19:08:01 compute-0 ovn_controller[19759]: 2025-10-08T19:08:01Z|00058|binding|INFO|Releasing lport 6943627d-6614-41cb-9460-f0454c6defb1 from this chassis (sb_readonly=0)
Oct  8 19:08:01 compute-0 ovn_controller[19759]: 2025-10-08T19:08:01Z|00059|binding|INFO|Setting lport 6943627d-6614-41cb-9460-f0454c6defb1 down in Southbound
Oct  8 19:08:01 compute-0 ovn_controller[19759]: 2025-10-08T19:08:01Z|00060|binding|INFO|Removing iface tap6943627d-66 ovn-installed in OVS
Oct  8 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.599 2 DEBUG nova.virt.libvirt.driver [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Start waiting for the detach event from libvirt for device tap6943627d-66 with device alias net1 for instance b66b330b-1cad-4dfb-a2f9-83201dc8ee32 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Oct  8 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.599 2 DEBUG nova.virt.libvirt.guest [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:bc:a5:e4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6943627d-66"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Oct  8 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:08:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:01.609 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bc:a5:e4 10.100.0.29'], port_security=['fa:16:3e:bc:a5:e4 10.100.0.29'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.29/28', 'neutron:device_id': 'b66b330b-1cad-4dfb-a2f9-83201dc8ee32', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c73d9547-8a91-4802-82a8-1a3a035fe63c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'be57f10c-6afc-483d-a1fa-fab953b8fe3e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4df9aed3-d2c0-400e-9a01-f8aebdd77f61, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=6943627d-6614-41cb-9460-f0454c6defb1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  8 19:08:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:01.611 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 6943627d-6614-41cb-9460-f0454c6defb1 in datapath c73d9547-8a91-4802-82a8-1a3a035fe63c unbound from our chassis
Oct  8 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.606 2 DEBUG nova.virt.libvirt.guest [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:bc:a5:e4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6943627d-66"/></interface> not found in domain: <domain type='kvm' id='3'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <name>instance-00000003</name>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <uuid>b66b330b-1cad-4dfb-a2f9-83201dc8ee32</uuid>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <metadata>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <nova:name>tempest-TestNetworkBasicOps-server-602516393</nova:name>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <nova:creationTime>2025-10-08 19:07:59</nova:creationTime>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <nova:flavor name="m1.nano">
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <nova:memory>128</nova:memory>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <nova:disk>1</nova:disk>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <nova:swap>0</nova:swap>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <nova:ephemeral>0</nova:ephemeral>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <nova:vcpus>1</nova:vcpus>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  </nova:flavor>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <nova:owner>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  </nova:owner>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <nova:ports>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <nova:port uuid="0107be0e-1b4b-47dd-9422-a435ded0964c">
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </nova:port>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <nova:port uuid="6943627d-6614-41cb-9460-f0454c6defb1">
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <nova:ip type="fixed" address="10.100.0.29" ipVersion="4"/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </nova:port>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  </nova:ports>
Oct  8 19:08:01 compute-0 nova_compute[117514]: </nova:instance>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  </metadata>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <memory unit='KiB'>131072</memory>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <currentMemory unit='KiB'>131072</currentMemory>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <vcpu placement='static'>1</vcpu>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <resource>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <partition>/machine</partition>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  </resource>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <sysinfo type='smbios'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <system>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <entry name='manufacturer'>RDO</entry>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <entry name='product'>OpenStack Compute</entry>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <entry name='serial'>b66b330b-1cad-4dfb-a2f9-83201dc8ee32</entry>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <entry name='uuid'>b66b330b-1cad-4dfb-a2f9-83201dc8ee32</entry>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <entry name='family'>Virtual Machine</entry>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </system>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  </sysinfo>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <os>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <boot dev='hd'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <smbios mode='sysinfo'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  </os>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <features>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <acpi/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <apic/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <vmcoreinfo state='on'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  </features>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <cpu mode='custom' match='exact' check='full'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <model fallback='forbid'>EPYC-Rome</model>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <vendor>AMD</vendor>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='x2apic'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='tsc-deadline'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='hypervisor'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='tsc_adjust'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='spec-ctrl'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='stibp'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='arch-capabilities'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='ssbd'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='cmp_legacy'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='overflow-recov'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='succor'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='ibrs'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='amd-ssbd'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='virt-ssbd'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='disable' name='lbrv'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='disable' name='tsc-scale'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='disable' name='vmcb-clean'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='disable' name='flushbyasid'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='disable' name='pause-filter'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='disable' name='pfthreshold'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='disable' name='svme-addr-chk'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='lfence-always-serializing'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='rdctl-no'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='mds-no'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='pschange-mc-no'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='gds-no'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='rfds-no'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='disable' name='xsaves'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='disable' name='svm'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='require' name='topoext'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='disable' name='npt'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <feature policy='disable' name='nrip-save'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  </cpu>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <clock offset='utc'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <timer name='pit' tickpolicy='delay'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <timer name='rtc' tickpolicy='catchup'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <timer name='hpet' present='no'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  </clock>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <on_poweroff>destroy</on_poweroff>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <on_reboot>restart</on_reboot>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <on_crash>destroy</on_crash>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <devices>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <disk type='file' device='disk'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <driver name='qemu' type='qcow2' cache='none'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <source file='/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk' index='2'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <backingStore type='file' index='3'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:        <format type='raw'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:        <source file='/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:        <backingStore/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      </backingStore>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target dev='vda' bus='virtio'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='virtio-disk0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <disk type='file' device='cdrom'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <driver name='qemu' type='raw' cache='none'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <source file='/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk.config' index='1'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <backingStore/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target dev='sda' bus='sata'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <readonly/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='sata0-0-0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='0' model='pcie-root'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pcie.0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='1' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='1' port='0x10'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.1'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='2' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='2' port='0x11'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.2'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='3' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='3' port='0x12'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.3'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='4' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='4' port='0x13'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.4'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='5' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='5' port='0x14'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.5'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='6' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='6' port='0x15'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.6'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='7' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='7' port='0x16'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.7'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='8' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='8' port='0x17'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.8'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='9' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='9' port='0x18'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.9'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='10' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='10' port='0x19'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.10'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='11' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='11' port='0x1a'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.11'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='12' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='12' port='0x1b'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.12'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='13' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='13' port='0x1c'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.13'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='14' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='14' port='0x1d'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.14'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='15' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='15' port='0x1e'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.15'/>
Oct  8 19:08:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:01.613 28643 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c73d9547-8a91-4802-82a8-1a3a035fe63c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='16' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='16' port='0x1f'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.16'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='17' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='17' port='0x20'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.17'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='18' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='18' port='0x21'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.18'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='19' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='19' port='0x22'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.19'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='20' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='20' port='0x23'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.20'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='21' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='21' port='0x24'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.21'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='22' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='22' port='0x25'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.22'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='23' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='23' port='0x26'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.23'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='24' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='24' port='0x27'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.24'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='25' model='pcie-root-port'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target chassis='25' port='0x28'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.25'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model name='pcie-pci-bridge'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='pci.26'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='usb' index='0' model='piix3-uhci'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='usb'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <controller type='sata' index='0'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='ide'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <interface type='ethernet'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <mac address='fa:16:3e:d7:63:9d'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target dev='tap0107be0e-1b'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model type='virtio'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <driver name='vhost' rx_queue_size='512'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <mtu size='1442'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='net0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </interface>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <serial type='pty'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <source path='/dev/pts/0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <log file='/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/console.log' append='off'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target type='isa-serial' port='0'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:        <model name='isa-serial'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      </target>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='serial0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </serial>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <console type='pty' tty='/dev/pts/0'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <source path='/dev/pts/0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <log file='/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/console.log' append='off'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <target type='serial' port='0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='serial0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </console>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <input type='tablet' bus='usb'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='input0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='usb' bus='0' port='1'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </input>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <input type='mouse' bus='ps2'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='input1'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </input>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <input type='keyboard' bus='ps2'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='input2'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </input>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <listen type='address' address='::0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </graphics>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <audio id='1' type='none'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <video>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <model type='virtio' heads='1' primary='yes'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='video0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </video>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <watchdog model='itco' action='reset'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='watchdog0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </watchdog>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <memballoon model='virtio'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <stats period='10'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='balloon0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </memballoon>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <rng model='virtio'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <backend model='random'>/dev/urandom</backend>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <alias name='rng0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </rng>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  </devices>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <label>system_u:system_r:svirt_t:s0:c277,c815</label>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c277,c815</imagelabel>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  </seclabel>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <label>+107:+107</label>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <imagelabel>+107:+107</imagelabel>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  </seclabel>
Oct  8 19:08:01 compute-0 nova_compute[117514]: </domain>
Oct  8 19:08:01 compute-0 nova_compute[117514]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Oct  8 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.607 2 INFO nova.virt.libvirt.driver [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully detached device tap6943627d-66 from instance b66b330b-1cad-4dfb-a2f9-83201dc8ee32 from the live domain config.#033[00m
Oct  8 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.608 2 DEBUG nova.virt.libvirt.vif [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:07:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-602516393',display_name='tempest-TestNetworkBasicOps-server-602516393',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-602516393',id=3,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC1aTvBPkgf1VfNjvC4uuKKg+tISnXImijmvs2cGp+FtgeRrvxYh9lBxLRU9xSzH0Z6LaCabBaf6NwgK+eU8uEwumcvsX4qsd2EcbV6VjIknh+8LBbcMTdQeQSSFJx6qhQ==',key_name='tempest-TestNetworkBasicOps-1029193278',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:07:32Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-dwebwbaf',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:07:32Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=b66b330b-1cad-4dfb-a2f9-83201dc8ee32,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6943627d-6614-41cb-9460-f0454c6defb1", "address": "fa:16:3e:bc:a5:e4", "network": {"id": "c73d9547-8a91-4802-82a8-1a3a035fe63c", "bridge": "br-int", "label": "tempest-network-smoke--833981410", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6943627d-66", "ovs_interfaceid": "6943627d-6614-41cb-9460-f0454c6defb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.608 2 DEBUG nova.network.os_vif_util [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "6943627d-6614-41cb-9460-f0454c6defb1", "address": "fa:16:3e:bc:a5:e4", "network": {"id": "c73d9547-8a91-4802-82a8-1a3a035fe63c", "bridge": "br-int", "label": "tempest-network-smoke--833981410", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6943627d-66", "ovs_interfaceid": "6943627d-6614-41cb-9460-f0454c6defb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.608 2 DEBUG nova.network.os_vif_util [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bc:a5:e4,bridge_name='br-int',has_traffic_filtering=True,id=6943627d-6614-41cb-9460-f0454c6defb1,network=Network(c73d9547-8a91-4802-82a8-1a3a035fe63c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6943627d-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.609 2 DEBUG os_vif [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bc:a5:e4,bridge_name='br-int',has_traffic_filtering=True,id=6943627d-6614-41cb-9460-f0454c6defb1,network=Network(c73d9547-8a91-4802-82a8-1a3a035fe63c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6943627d-66') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.610 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6943627d-66, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 19:08:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:01.614 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[68fc4b23-3161-44cc-8087-3db7e67f3382]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:08:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:01.615 28643 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c namespace which is not needed anymore#033[00m
Oct  8 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.628 2 INFO os_vif [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bc:a5:e4,bridge_name='br-int',has_traffic_filtering=True,id=6943627d-6614-41cb-9460-f0454c6defb1,network=Network(c73d9547-8a91-4802-82a8-1a3a035fe63c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6943627d-66')#033[00m
Oct  8 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.629 2 DEBUG nova.virt.libvirt.guest [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <nova:name>tempest-TestNetworkBasicOps-server-602516393</nova:name>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <nova:creationTime>2025-10-08 19:08:01</nova:creationTime>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <nova:flavor name="m1.nano">
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <nova:memory>128</nova:memory>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <nova:disk>1</nova:disk>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <nova:swap>0</nova:swap>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <nova:ephemeral>0</nova:ephemeral>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <nova:vcpus>1</nova:vcpus>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  </nova:flavor>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <nova:owner>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  </nova:owner>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  <nova:ports>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    <nova:port uuid="0107be0e-1b4b-47dd-9422-a435ded0964c">
Oct  8 19:08:01 compute-0 nova_compute[117514]:      <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  8 19:08:01 compute-0 nova_compute[117514]:    </nova:port>
Oct  8 19:08:01 compute-0 nova_compute[117514]:  </nova:ports>
Oct  8 19:08:01 compute-0 nova_compute[117514]: </nova:instance>
Oct  8 19:08:01 compute-0 nova_compute[117514]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Oct  8 19:08:01 compute-0 neutron-haproxy-ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c[146137]: [NOTICE]   (146141) : haproxy version is 2.8.14-c23fe91
Oct  8 19:08:01 compute-0 neutron-haproxy-ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c[146137]: [NOTICE]   (146141) : path to executable is /usr/sbin/haproxy
Oct  8 19:08:01 compute-0 neutron-haproxy-ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c[146137]: [WARNING]  (146141) : Exiting Master process...
Oct  8 19:08:01 compute-0 neutron-haproxy-ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c[146137]: [WARNING]  (146141) : Exiting Master process...
Oct  8 19:08:01 compute-0 neutron-haproxy-ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c[146137]: [ALERT]    (146141) : Current worker (146143) exited with code 143 (Terminated)
Oct  8 19:08:01 compute-0 neutron-haproxy-ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c[146137]: [WARNING]  (146141) : All workers exited. Exiting... (0)
Oct  8 19:08:01 compute-0 systemd[1]: libpod-e8c6faf61c8db69d560312908f590b9785d8039ff6bc8873fa27b59c76a83637.scope: Deactivated successfully.
Oct  8 19:08:01 compute-0 podman[146173]: 2025-10-08 19:08:01.777604952 +0000 UTC m=+0.042460126 container died e8c6faf61c8db69d560312908f590b9785d8039ff6bc8873fa27b59c76a83637 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct  8 19:08:01 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e8c6faf61c8db69d560312908f590b9785d8039ff6bc8873fa27b59c76a83637-userdata-shm.mount: Deactivated successfully.
Oct  8 19:08:01 compute-0 systemd[1]: var-lib-containers-storage-overlay-274f69a4d16ffcd141ca0c2aabf5d247e94ba26d928301563c5c3be5cc17c132-merged.mount: Deactivated successfully.
Oct  8 19:08:01 compute-0 podman[146173]: 2025-10-08 19:08:01.814851161 +0000 UTC m=+0.079706345 container cleanup e8c6faf61c8db69d560312908f590b9785d8039ff6bc8873fa27b59c76a83637 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3)
Oct  8 19:08:01 compute-0 systemd[1]: libpod-conmon-e8c6faf61c8db69d560312908f590b9785d8039ff6bc8873fa27b59c76a83637.scope: Deactivated successfully.
Oct  8 19:08:01 compute-0 podman[146211]: 2025-10-08 19:08:01.872341417 +0000 UTC m=+0.036351702 container remove e8c6faf61c8db69d560312908f590b9785d8039ff6bc8873fa27b59c76a83637 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 19:08:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:01.879 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[73371fd9-02e3-4551-a7fb-50b56767626c]: (4, ('Wed Oct  8 07:08:01 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c (e8c6faf61c8db69d560312908f590b9785d8039ff6bc8873fa27b59c76a83637)\ne8c6faf61c8db69d560312908f590b9785d8039ff6bc8873fa27b59c76a83637\nWed Oct  8 07:08:01 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c (e8c6faf61c8db69d560312908f590b9785d8039ff6bc8873fa27b59c76a83637)\ne8c6faf61c8db69d560312908f590b9785d8039ff6bc8873fa27b59c76a83637\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:08:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:01.881 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[09382b5c-2f24-4a4d-8609-c77914f14cea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:08:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:01.883 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc73d9547-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:08:01 compute-0 kernel: tapc73d9547-80: left promiscuous mode
Oct  8 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:01 compute-0 nova_compute[117514]: 2025-10-08 19:08:01.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:01.911 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[aedf8744-deff-404f-b1ad-a21929c862c0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:08:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:01.938 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[7bef5564-83be-4906-89eb-4a45adde8950]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:08:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:01.940 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[1ae5991c-fefe-4d12-bdd4-a2c06f98f019]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:08:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:01.954 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[b4f94b68-5b78-422f-8cd4-6a94bba88f3d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 115031, 'reachable_time': 34555, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 146226, 'error': None, 'target': 'ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:08:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:01.956 28783 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c73d9547-8a91-4802-82a8-1a3a035fe63c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 19:08:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:01.956 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[bf558c68-a6d0-45cc-a64b-b5bcc502b6c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:08:01 compute-0 systemd[1]: run-netns-ovnmeta\x2dc73d9547\x2d8a91\x2d4802\x2d82a8\x2d1a3a035fe63c.mount: Deactivated successfully.
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.309 2 DEBUG oslo_concurrency.lockutils [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.310 2 DEBUG oslo_concurrency.lockutils [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquired lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.310 2 DEBUG nova.network.neutron [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.351 2 DEBUG nova.compute.manager [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received event network-vif-deleted-6943627d-6614-41cb-9460-f0454c6defb1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.351 2 INFO nova.compute.manager [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Neutron deleted interface 6943627d-6614-41cb-9460-f0454c6defb1; detaching it from the instance and deleting it from the info cache#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.352 2 DEBUG nova.network.neutron [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Updating instance_info_cache with network_info: [{"id": "0107be0e-1b4b-47dd-9422-a435ded0964c", "address": "fa:16:3e:d7:63:9d", "network": {"id": "15690acb-54cf-4081-a718-c14a1c0af6a8", "bridge": "br-int", "label": "tempest-network-smoke--977169033", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0107be0e-1b", "ovs_interfaceid": "0107be0e-1b4b-47dd-9422-a435ded0964c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.374 2 DEBUG nova.objects.instance [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lazy-loading 'system_metadata' on Instance uuid b66b330b-1cad-4dfb-a2f9-83201dc8ee32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.404 2 DEBUG nova.compute.manager [req-7117c193-3715-4b74-80b2-4831a0067785 req-7ac74961-b315-47f6-9b35-dd563793e8df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received event network-vif-plugged-6943627d-6614-41cb-9460-f0454c6defb1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.405 2 DEBUG oslo_concurrency.lockutils [req-7117c193-3715-4b74-80b2-4831a0067785 req-7ac74961-b315-47f6-9b35-dd563793e8df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.406 2 DEBUG oslo_concurrency.lockutils [req-7117c193-3715-4b74-80b2-4831a0067785 req-7ac74961-b315-47f6-9b35-dd563793e8df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.406 2 DEBUG oslo_concurrency.lockutils [req-7117c193-3715-4b74-80b2-4831a0067785 req-7ac74961-b315-47f6-9b35-dd563793e8df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.406 2 DEBUG nova.compute.manager [req-7117c193-3715-4b74-80b2-4831a0067785 req-7ac74961-b315-47f6-9b35-dd563793e8df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] No waiting events found dispatching network-vif-plugged-6943627d-6614-41cb-9460-f0454c6defb1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.406 2 WARNING nova.compute.manager [req-7117c193-3715-4b74-80b2-4831a0067785 req-7ac74961-b315-47f6-9b35-dd563793e8df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received unexpected event network-vif-plugged-6943627d-6614-41cb-9460-f0454c6defb1 for instance with vm_state active and task_state None.#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.407 2 DEBUG nova.compute.manager [req-7117c193-3715-4b74-80b2-4831a0067785 req-7ac74961-b315-47f6-9b35-dd563793e8df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received event network-vif-unplugged-6943627d-6614-41cb-9460-f0454c6defb1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.407 2 DEBUG oslo_concurrency.lockutils [req-7117c193-3715-4b74-80b2-4831a0067785 req-7ac74961-b315-47f6-9b35-dd563793e8df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.407 2 DEBUG oslo_concurrency.lockutils [req-7117c193-3715-4b74-80b2-4831a0067785 req-7ac74961-b315-47f6-9b35-dd563793e8df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.408 2 DEBUG oslo_concurrency.lockutils [req-7117c193-3715-4b74-80b2-4831a0067785 req-7ac74961-b315-47f6-9b35-dd563793e8df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.408 2 DEBUG nova.compute.manager [req-7117c193-3715-4b74-80b2-4831a0067785 req-7ac74961-b315-47f6-9b35-dd563793e8df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] No waiting events found dispatching network-vif-unplugged-6943627d-6614-41cb-9460-f0454c6defb1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.408 2 WARNING nova.compute.manager [req-7117c193-3715-4b74-80b2-4831a0067785 req-7ac74961-b315-47f6-9b35-dd563793e8df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received unexpected event network-vif-unplugged-6943627d-6614-41cb-9460-f0454c6defb1 for instance with vm_state active and task_state None.#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.408 2 DEBUG nova.compute.manager [req-7117c193-3715-4b74-80b2-4831a0067785 req-7ac74961-b315-47f6-9b35-dd563793e8df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received event network-vif-plugged-6943627d-6614-41cb-9460-f0454c6defb1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.409 2 DEBUG oslo_concurrency.lockutils [req-7117c193-3715-4b74-80b2-4831a0067785 req-7ac74961-b315-47f6-9b35-dd563793e8df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.409 2 DEBUG oslo_concurrency.lockutils [req-7117c193-3715-4b74-80b2-4831a0067785 req-7ac74961-b315-47f6-9b35-dd563793e8df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.409 2 DEBUG oslo_concurrency.lockutils [req-7117c193-3715-4b74-80b2-4831a0067785 req-7ac74961-b315-47f6-9b35-dd563793e8df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.409 2 DEBUG nova.compute.manager [req-7117c193-3715-4b74-80b2-4831a0067785 req-7ac74961-b315-47f6-9b35-dd563793e8df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] No waiting events found dispatching network-vif-plugged-6943627d-6614-41cb-9460-f0454c6defb1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.410 2 WARNING nova.compute.manager [req-7117c193-3715-4b74-80b2-4831a0067785 req-7ac74961-b315-47f6-9b35-dd563793e8df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received unexpected event network-vif-plugged-6943627d-6614-41cb-9460-f0454c6defb1 for instance with vm_state active and task_state None.#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.414 2 DEBUG nova.objects.instance [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lazy-loading 'flavor' on Instance uuid b66b330b-1cad-4dfb-a2f9-83201dc8ee32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.455 2 DEBUG nova.virt.libvirt.vif [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:07:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-602516393',display_name='tempest-TestNetworkBasicOps-server-602516393',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-602516393',id=3,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC1aTvBPkgf1VfNjvC4uuKKg+tISnXImijmvs2cGp+FtgeRrvxYh9lBxLRU9xSzH0Z6LaCabBaf6NwgK+eU8uEwumcvsX4qsd2EcbV6VjIknh+8LBbcMTdQeQSSFJx6qhQ==',key_name='tempest-TestNetworkBasicOps-1029193278',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:07:32Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-dwebwbaf',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:07:32Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=b66b330b-1cad-4dfb-a2f9-83201dc8ee32,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6943627d-6614-41cb-9460-f0454c6defb1", "address": "fa:16:3e:bc:a5:e4", "network": {"id": "c73d9547-8a91-4802-82a8-1a3a035fe63c", "bridge": "br-int", "label": "tempest-network-smoke--833981410", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6943627d-66", "ovs_interfaceid": "6943627d-6614-41cb-9460-f0454c6defb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.455 2 DEBUG nova.network.os_vif_util [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Converting VIF {"id": "6943627d-6614-41cb-9460-f0454c6defb1", "address": "fa:16:3e:bc:a5:e4", "network": {"id": "c73d9547-8a91-4802-82a8-1a3a035fe63c", "bridge": "br-int", "label": "tempest-network-smoke--833981410", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6943627d-66", "ovs_interfaceid": "6943627d-6614-41cb-9460-f0454c6defb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.456 2 DEBUG nova.network.os_vif_util [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bc:a5:e4,bridge_name='br-int',has_traffic_filtering=True,id=6943627d-6614-41cb-9460-f0454c6defb1,network=Network(c73d9547-8a91-4802-82a8-1a3a035fe63c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6943627d-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.460 2 DEBUG nova.virt.libvirt.guest [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:bc:a5:e4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6943627d-66"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.467 2 DEBUG nova.virt.libvirt.guest [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:bc:a5:e4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6943627d-66"/></interface>not found in domain: <domain type='kvm' id='3'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <name>instance-00000003</name>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <uuid>b66b330b-1cad-4dfb-a2f9-83201dc8ee32</uuid>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <metadata>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <nova:name>tempest-TestNetworkBasicOps-server-602516393</nova:name>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <nova:creationTime>2025-10-08 19:08:01</nova:creationTime>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <nova:flavor name="m1.nano">
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <nova:memory>128</nova:memory>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <nova:disk>1</nova:disk>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <nova:swap>0</nova:swap>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <nova:ephemeral>0</nova:ephemeral>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <nova:vcpus>1</nova:vcpus>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  </nova:flavor>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <nova:owner>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  </nova:owner>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <nova:ports>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <nova:port uuid="0107be0e-1b4b-47dd-9422-a435ded0964c">
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </nova:port>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  </nova:ports>
Oct  8 19:08:02 compute-0 nova_compute[117514]: </nova:instance>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  </metadata>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <memory unit='KiB'>131072</memory>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <currentMemory unit='KiB'>131072</currentMemory>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <vcpu placement='static'>1</vcpu>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <resource>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <partition>/machine</partition>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  </resource>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <sysinfo type='smbios'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <system>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <entry name='manufacturer'>RDO</entry>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <entry name='product'>OpenStack Compute</entry>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <entry name='serial'>b66b330b-1cad-4dfb-a2f9-83201dc8ee32</entry>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <entry name='uuid'>b66b330b-1cad-4dfb-a2f9-83201dc8ee32</entry>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <entry name='family'>Virtual Machine</entry>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </system>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  </sysinfo>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <os>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <boot dev='hd'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <smbios mode='sysinfo'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  </os>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <features>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <acpi/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <apic/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <vmcoreinfo state='on'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  </features>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <cpu mode='custom' match='exact' check='full'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <model fallback='forbid'>EPYC-Rome</model>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <vendor>AMD</vendor>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='x2apic'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='tsc-deadline'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='hypervisor'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='tsc_adjust'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='spec-ctrl'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='stibp'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='arch-capabilities'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='ssbd'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='cmp_legacy'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='overflow-recov'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='succor'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='ibrs'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='amd-ssbd'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='virt-ssbd'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='disable' name='lbrv'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='disable' name='tsc-scale'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='disable' name='vmcb-clean'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='disable' name='flushbyasid'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='disable' name='pause-filter'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='disable' name='pfthreshold'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='disable' name='svme-addr-chk'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='lfence-always-serializing'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='rdctl-no'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='mds-no'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='pschange-mc-no'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='gds-no'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='rfds-no'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='disable' name='xsaves'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='disable' name='svm'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='topoext'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='disable' name='npt'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='disable' name='nrip-save'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  </cpu>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <clock offset='utc'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <timer name='pit' tickpolicy='delay'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <timer name='rtc' tickpolicy='catchup'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <timer name='hpet' present='no'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  </clock>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <on_poweroff>destroy</on_poweroff>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <on_reboot>restart</on_reboot>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <on_crash>destroy</on_crash>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <devices>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <disk type='file' device='disk'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <driver name='qemu' type='qcow2' cache='none'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <source file='/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk' index='2'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <backingStore type='file' index='3'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:        <format type='raw'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:        <source file='/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:        <backingStore/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      </backingStore>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target dev='vda' bus='virtio'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='virtio-disk0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <disk type='file' device='cdrom'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <driver name='qemu' type='raw' cache='none'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <source file='/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk.config' index='1'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <backingStore/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target dev='sda' bus='sata'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <readonly/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='sata0-0-0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='0' model='pcie-root'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pcie.0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='1' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='1' port='0x10'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.1'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='2' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='2' port='0x11'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.2'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='3' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='3' port='0x12'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.3'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='4' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='4' port='0x13'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.4'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='5' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='5' port='0x14'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.5'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='6' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='6' port='0x15'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.6'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='7' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='7' port='0x16'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.7'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='8' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='8' port='0x17'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.8'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='9' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='9' port='0x18'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.9'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='10' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='10' port='0x19'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.10'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='11' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='11' port='0x1a'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.11'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='12' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='12' port='0x1b'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.12'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='13' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='13' port='0x1c'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.13'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='14' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='14' port='0x1d'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.14'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='15' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='15' port='0x1e'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.15'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='16' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='16' port='0x1f'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.16'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='17' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='17' port='0x20'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.17'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='18' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='18' port='0x21'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.18'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='19' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='19' port='0x22'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.19'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='20' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='20' port='0x23'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.20'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='21' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='21' port='0x24'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.21'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='22' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='22' port='0x25'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.22'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='23' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='23' port='0x26'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.23'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='24' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='24' port='0x27'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.24'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='25' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='25' port='0x28'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.25'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-pci-bridge'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.26'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='usb' index='0' model='piix3-uhci'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='usb'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='sata' index='0'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='ide'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <interface type='ethernet'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <mac address='fa:16:3e:d7:63:9d'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target dev='tap0107be0e-1b'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model type='virtio'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <driver name='vhost' rx_queue_size='512'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <mtu size='1442'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='net0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </interface>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <serial type='pty'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <source path='/dev/pts/0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <log file='/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/console.log' append='off'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target type='isa-serial' port='0'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:        <model name='isa-serial'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      </target>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='serial0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </serial>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <console type='pty' tty='/dev/pts/0'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <source path='/dev/pts/0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <log file='/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/console.log' append='off'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target type='serial' port='0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='serial0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </console>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <input type='tablet' bus='usb'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='input0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='usb' bus='0' port='1'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </input>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <input type='mouse' bus='ps2'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='input1'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </input>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <input type='keyboard' bus='ps2'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='input2'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </input>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <listen type='address' address='::0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </graphics>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <audio id='1' type='none'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <video>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model type='virtio' heads='1' primary='yes'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='video0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </video>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <watchdog model='itco' action='reset'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='watchdog0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </watchdog>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <memballoon model='virtio'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <stats period='10'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='balloon0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </memballoon>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <rng model='virtio'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <backend model='random'>/dev/urandom</backend>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='rng0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </rng>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  </devices>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <label>system_u:system_r:svirt_t:s0:c277,c815</label>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c277,c815</imagelabel>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  </seclabel>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <label>+107:+107</label>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <imagelabel>+107:+107</imagelabel>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  </seclabel>
Oct  8 19:08:02 compute-0 nova_compute[117514]: </domain>
Oct  8 19:08:02 compute-0 nova_compute[117514]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.467 2 DEBUG nova.virt.libvirt.guest [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:bc:a5:e4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6943627d-66"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.471 2 DEBUG nova.virt.libvirt.guest [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:bc:a5:e4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap6943627d-66"/></interface>not found in domain: <domain type='kvm' id='3'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <name>instance-00000003</name>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <uuid>b66b330b-1cad-4dfb-a2f9-83201dc8ee32</uuid>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <metadata>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <nova:name>tempest-TestNetworkBasicOps-server-602516393</nova:name>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <nova:creationTime>2025-10-08 19:08:01</nova:creationTime>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <nova:flavor name="m1.nano">
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <nova:memory>128</nova:memory>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <nova:disk>1</nova:disk>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <nova:swap>0</nova:swap>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <nova:ephemeral>0</nova:ephemeral>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <nova:vcpus>1</nova:vcpus>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  </nova:flavor>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <nova:owner>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  </nova:owner>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <nova:ports>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <nova:port uuid="0107be0e-1b4b-47dd-9422-a435ded0964c">
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </nova:port>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  </nova:ports>
Oct  8 19:08:02 compute-0 nova_compute[117514]: </nova:instance>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  </metadata>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <memory unit='KiB'>131072</memory>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <currentMemory unit='KiB'>131072</currentMemory>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <vcpu placement='static'>1</vcpu>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <resource>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <partition>/machine</partition>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  </resource>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <sysinfo type='smbios'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <system>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <entry name='manufacturer'>RDO</entry>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <entry name='product'>OpenStack Compute</entry>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <entry name='serial'>b66b330b-1cad-4dfb-a2f9-83201dc8ee32</entry>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <entry name='uuid'>b66b330b-1cad-4dfb-a2f9-83201dc8ee32</entry>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <entry name='family'>Virtual Machine</entry>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </system>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  </sysinfo>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <os>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <boot dev='hd'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <smbios mode='sysinfo'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  </os>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <features>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <acpi/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <apic/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <vmcoreinfo state='on'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  </features>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <cpu mode='custom' match='exact' check='full'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <model fallback='forbid'>EPYC-Rome</model>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <vendor>AMD</vendor>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='x2apic'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='tsc-deadline'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='hypervisor'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='tsc_adjust'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='spec-ctrl'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='stibp'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='arch-capabilities'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='ssbd'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='cmp_legacy'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='overflow-recov'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='succor'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='ibrs'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='amd-ssbd'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='virt-ssbd'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='disable' name='lbrv'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='disable' name='tsc-scale'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='disable' name='vmcb-clean'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='disable' name='flushbyasid'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='disable' name='pause-filter'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='disable' name='pfthreshold'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='disable' name='svme-addr-chk'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='lfence-always-serializing'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='rdctl-no'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='mds-no'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='pschange-mc-no'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='gds-no'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='rfds-no'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='disable' name='xsaves'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='disable' name='svm'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='require' name='topoext'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='disable' name='npt'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <feature policy='disable' name='nrip-save'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  </cpu>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <clock offset='utc'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <timer name='pit' tickpolicy='delay'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <timer name='rtc' tickpolicy='catchup'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <timer name='hpet' present='no'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  </clock>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <on_poweroff>destroy</on_poweroff>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <on_reboot>restart</on_reboot>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <on_crash>destroy</on_crash>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <devices>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <disk type='file' device='disk'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <driver name='qemu' type='qcow2' cache='none'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <source file='/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk' index='2'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <backingStore type='file' index='3'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:        <format type='raw'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:        <source file='/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:        <backingStore/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      </backingStore>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target dev='vda' bus='virtio'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='virtio-disk0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <disk type='file' device='cdrom'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <driver name='qemu' type='raw' cache='none'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <source file='/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/disk.config' index='1'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <backingStore/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target dev='sda' bus='sata'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <readonly/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='sata0-0-0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='0' model='pcie-root'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pcie.0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='1' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='1' port='0x10'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.1'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='2' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='2' port='0x11'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.2'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='3' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='3' port='0x12'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.3'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='4' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='4' port='0x13'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.4'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='5' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='5' port='0x14'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.5'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='6' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='6' port='0x15'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.6'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='7' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='7' port='0x16'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.7'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='8' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='8' port='0x17'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.8'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='9' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='9' port='0x18'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.9'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='10' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='10' port='0x19'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.10'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='11' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='11' port='0x1a'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.11'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='12' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='12' port='0x1b'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.12'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='13' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='13' port='0x1c'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.13'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='14' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='14' port='0x1d'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.14'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='15' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='15' port='0x1e'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.15'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='16' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='16' port='0x1f'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.16'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='17' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='17' port='0x20'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.17'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='18' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='18' port='0x21'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.18'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='19' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='19' port='0x22'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.19'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='20' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='20' port='0x23'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.20'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='21' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='21' port='0x24'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.21'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='22' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='22' port='0x25'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.22'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='23' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='23' port='0x26'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.23'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='24' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='24' port='0x27'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.24'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='25' model='pcie-root-port'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target chassis='25' port='0x28'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.25'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model name='pcie-pci-bridge'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='pci.26'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='usb' index='0' model='piix3-uhci'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='usb'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <controller type='sata' index='0'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='ide'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <interface type='ethernet'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <mac address='fa:16:3e:d7:63:9d'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target dev='tap0107be0e-1b'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model type='virtio'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <driver name='vhost' rx_queue_size='512'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <mtu size='1442'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='net0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </interface>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <serial type='pty'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <source path='/dev/pts/0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <log file='/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/console.log' append='off'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target type='isa-serial' port='0'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:        <model name='isa-serial'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      </target>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='serial0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </serial>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <console type='pty' tty='/dev/pts/0'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <source path='/dev/pts/0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <log file='/var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32/console.log' append='off'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <target type='serial' port='0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='serial0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </console>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <input type='tablet' bus='usb'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='input0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='usb' bus='0' port='1'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </input>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <input type='mouse' bus='ps2'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='input1'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </input>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <input type='keyboard' bus='ps2'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='input2'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </input>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <listen type='address' address='::0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </graphics>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <audio id='1' type='none'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <video>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <model type='virtio' heads='1' primary='yes'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='video0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </video>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <watchdog model='itco' action='reset'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='watchdog0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </watchdog>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <memballoon model='virtio'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <stats period='10'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='balloon0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </memballoon>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <rng model='virtio'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <backend model='random'>/dev/urandom</backend>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <alias name='rng0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </rng>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  </devices>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <label>system_u:system_r:svirt_t:s0:c277,c815</label>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c277,c815</imagelabel>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  </seclabel>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <label>+107:+107</label>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <imagelabel>+107:+107</imagelabel>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  </seclabel>
Oct  8 19:08:02 compute-0 nova_compute[117514]: </domain>
Oct  8 19:08:02 compute-0 nova_compute[117514]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.472 2 WARNING nova.virt.libvirt.driver [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Detaching interface fa:16:3e:bc:a5:e4 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap6943627d-66' not found.#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.472 2 DEBUG nova.virt.libvirt.vif [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:07:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-602516393',display_name='tempest-TestNetworkBasicOps-server-602516393',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-602516393',id=3,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC1aTvBPkgf1VfNjvC4uuKKg+tISnXImijmvs2cGp+FtgeRrvxYh9lBxLRU9xSzH0Z6LaCabBaf6NwgK+eU8uEwumcvsX4qsd2EcbV6VjIknh+8LBbcMTdQeQSSFJx6qhQ==',key_name='tempest-TestNetworkBasicOps-1029193278',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:07:32Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-dwebwbaf',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:07:32Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=b66b330b-1cad-4dfb-a2f9-83201dc8ee32,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6943627d-6614-41cb-9460-f0454c6defb1", "address": "fa:16:3e:bc:a5:e4", "network": {"id": "c73d9547-8a91-4802-82a8-1a3a035fe63c", "bridge": "br-int", "label": "tempest-network-smoke--833981410", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6943627d-66", "ovs_interfaceid": "6943627d-6614-41cb-9460-f0454c6defb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.473 2 DEBUG nova.network.os_vif_util [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Converting VIF {"id": "6943627d-6614-41cb-9460-f0454c6defb1", "address": "fa:16:3e:bc:a5:e4", "network": {"id": "c73d9547-8a91-4802-82a8-1a3a035fe63c", "bridge": "br-int", "label": "tempest-network-smoke--833981410", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.29", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6943627d-66", "ovs_interfaceid": "6943627d-6614-41cb-9460-f0454c6defb1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.473 2 DEBUG nova.network.os_vif_util [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bc:a5:e4,bridge_name='br-int',has_traffic_filtering=True,id=6943627d-6614-41cb-9460-f0454c6defb1,network=Network(c73d9547-8a91-4802-82a8-1a3a035fe63c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6943627d-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.474 2 DEBUG os_vif [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bc:a5:e4,bridge_name='br-int',has_traffic_filtering=True,id=6943627d-6614-41cb-9460-f0454c6defb1,network=Network(c73d9547-8a91-4802-82a8-1a3a035fe63c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6943627d-66') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.475 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6943627d-66, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.476 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.479 2 INFO os_vif [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bc:a5:e4,bridge_name='br-int',has_traffic_filtering=True,id=6943627d-6614-41cb-9460-f0454c6defb1,network=Network(c73d9547-8a91-4802-82a8-1a3a035fe63c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6943627d-66')#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.480 2 DEBUG nova.virt.libvirt.guest [req-f11bb41b-8d96-45c5-b57a-bd4afd2f953b req-72ef385b-bc24-4646-befc-6f649d5ccd73 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <nova:name>tempest-TestNetworkBasicOps-server-602516393</nova:name>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <nova:creationTime>2025-10-08 19:08:02</nova:creationTime>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <nova:flavor name="m1.nano">
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <nova:memory>128</nova:memory>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <nova:disk>1</nova:disk>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <nova:swap>0</nova:swap>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <nova:ephemeral>0</nova:ephemeral>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <nova:vcpus>1</nova:vcpus>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  </nova:flavor>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <nova:owner>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  </nova:owner>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  <nova:ports>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    <nova:port uuid="0107be0e-1b4b-47dd-9422-a435ded0964c">
Oct  8 19:08:02 compute-0 nova_compute[117514]:      <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  8 19:08:02 compute-0 nova_compute[117514]:    </nova:port>
Oct  8 19:08:02 compute-0 nova_compute[117514]:  </nova:ports>
Oct  8 19:08:02 compute-0 nova_compute[117514]: </nova:instance>
Oct  8 19:08:02 compute-0 nova_compute[117514]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Oct  8 19:08:02 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:02.712 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f81f7a-64d8-418a-a74c-b879bd6deb83, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:08:02 compute-0 nova_compute[117514]: 2025-10-08 19:08:02.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:08:03 compute-0 ovn_controller[19759]: 2025-10-08T19:08:03Z|00061|binding|INFO|Releasing lport b2172a75-691e-43ff-a242-3b06a5bfd197 from this chassis (sb_readonly=0)
Oct  8 19:08:03 compute-0 nova_compute[117514]: 2025-10-08 19:08:03.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:03 compute-0 nova_compute[117514]: 2025-10-08 19:08:03.609 2 INFO nova.network.neutron [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Port 6943627d-6614-41cb-9460-f0454c6defb1 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Oct  8 19:08:03 compute-0 nova_compute[117514]: 2025-10-08 19:08:03.610 2 DEBUG nova.network.neutron [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Updating instance_info_cache with network_info: [{"id": "0107be0e-1b4b-47dd-9422-a435ded0964c", "address": "fa:16:3e:d7:63:9d", "network": {"id": "15690acb-54cf-4081-a718-c14a1c0af6a8", "bridge": "br-int", "label": "tempest-network-smoke--977169033", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0107be0e-1b", "ovs_interfaceid": "0107be0e-1b4b-47dd-9422-a435ded0964c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:08:03 compute-0 nova_compute[117514]: 2025-10-08 19:08:03.626 2 DEBUG oslo_concurrency.lockutils [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Releasing lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:08:03 compute-0 nova_compute[117514]: 2025-10-08 19:08:03.659 2 DEBUG oslo_concurrency.lockutils [None req-b5d9e1b3-1782-421a-b1da-54f0443e12b0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "interface-b66b330b-1cad-4dfb-a2f9-83201dc8ee32-6943627d-6614-41cb-9460-f0454c6defb1" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 2.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:08:03 compute-0 nova_compute[117514]: 2025-10-08 19:08:03.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.302 2 DEBUG oslo_concurrency.lockutils [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.303 2 DEBUG oslo_concurrency.lockutils [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.303 2 DEBUG oslo_concurrency.lockutils [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.304 2 DEBUG oslo_concurrency.lockutils [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.305 2 DEBUG oslo_concurrency.lockutils [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.307 2 INFO nova.compute.manager [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Terminating instance#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.309 2 DEBUG nova.compute.manager [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 19:08:04 compute-0 kernel: tap0107be0e-1b (unregistering): left promiscuous mode
Oct  8 19:08:04 compute-0 NetworkManager[1035]: <info>  [1759950484.3369] device (tap0107be0e-1b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 19:08:04 compute-0 ovn_controller[19759]: 2025-10-08T19:08:04Z|00062|binding|INFO|Releasing lport 0107be0e-1b4b-47dd-9422-a435ded0964c from this chassis (sb_readonly=0)
Oct  8 19:08:04 compute-0 ovn_controller[19759]: 2025-10-08T19:08:04Z|00063|binding|INFO|Setting lport 0107be0e-1b4b-47dd-9422-a435ded0964c down in Southbound
Oct  8 19:08:04 compute-0 ovn_controller[19759]: 2025-10-08T19:08:04Z|00064|binding|INFO|Removing iface tap0107be0e-1b ovn-installed in OVS
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:04.353 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d7:63:9d 10.100.0.6'], port_security=['fa:16:3e:d7:63:9d 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'b66b330b-1cad-4dfb-a2f9-83201dc8ee32', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-15690acb-54cf-4081-a718-c14a1c0af6a8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '4', 'neutron:security_group_ids': '18c7314c-d74a-4643-933f-4dc6b05c33cc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e9980b68-53e4-4dfd-a3d6-cbcaebcf011d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=0107be0e-1b4b-47dd-9422-a435ded0964c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:08:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:04.354 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 0107be0e-1b4b-47dd-9422-a435ded0964c in datapath 15690acb-54cf-4081-a718-c14a1c0af6a8 unbound from our chassis#033[00m
Oct  8 19:08:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:04.355 28643 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 15690acb-54cf-4081-a718-c14a1c0af6a8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 19:08:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:04.356 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[4ccca8e8-6874-4fc2-b8a8-7480457d14be]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:08:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:04.357 28643 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8 namespace which is not needed anymore#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:04 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000003.scope: Deactivated successfully.
Oct  8 19:08:04 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000003.scope: Consumed 14.532s CPU time.
Oct  8 19:08:04 compute-0 systemd-machined[77568]: Machine qemu-3-instance-00000003 terminated.
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.446 2 DEBUG nova.compute.manager [req-0d9e9642-a7dc-4aaa-87b5-79dd9358d42f req-495efbe4-c5c9-4f3e-80d5-5189ce5e2366 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received event network-changed-0107be0e-1b4b-47dd-9422-a435ded0964c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.447 2 DEBUG nova.compute.manager [req-0d9e9642-a7dc-4aaa-87b5-79dd9358d42f req-495efbe4-c5c9-4f3e-80d5-5189ce5e2366 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Refreshing instance network info cache due to event network-changed-0107be0e-1b4b-47dd-9422-a435ded0964c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.448 2 DEBUG oslo_concurrency.lockutils [req-0d9e9642-a7dc-4aaa-87b5-79dd9358d42f req-495efbe4-c5c9-4f3e-80d5-5189ce5e2366 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.448 2 DEBUG oslo_concurrency.lockutils [req-0d9e9642-a7dc-4aaa-87b5-79dd9358d42f req-495efbe4-c5c9-4f3e-80d5-5189ce5e2366 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.449 2 DEBUG nova.network.neutron [req-0d9e9642-a7dc-4aaa-87b5-79dd9358d42f req-495efbe4-c5c9-4f3e-80d5-5189ce5e2366 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Refreshing network info cache for port 0107be0e-1b4b-47dd-9422-a435ded0964c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 19:08:04 compute-0 neutron-haproxy-ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8[145840]: [NOTICE]   (145844) : haproxy version is 2.8.14-c23fe91
Oct  8 19:08:04 compute-0 neutron-haproxy-ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8[145840]: [NOTICE]   (145844) : path to executable is /usr/sbin/haproxy
Oct  8 19:08:04 compute-0 neutron-haproxy-ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8[145840]: [WARNING]  (145844) : Exiting Master process...
Oct  8 19:08:04 compute-0 neutron-haproxy-ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8[145840]: [ALERT]    (145844) : Current worker (145846) exited with code 143 (Terminated)
Oct  8 19:08:04 compute-0 neutron-haproxy-ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8[145840]: [WARNING]  (145844) : All workers exited. Exiting... (0)
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:04 compute-0 systemd[1]: libpod-f1a0117811421542b95673dae027a361c998bc57c3bcbb56c41602c72d45d71c.scope: Deactivated successfully.
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:04 compute-0 podman[146251]: 2025-10-08 19:08:04.541412234 +0000 UTC m=+0.064732265 container died f1a0117811421542b95673dae027a361c998bc57c3bcbb56c41602c72d45d71c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 19:08:04 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f1a0117811421542b95673dae027a361c998bc57c3bcbb56c41602c72d45d71c-userdata-shm.mount: Deactivated successfully.
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.576 2 INFO nova.virt.libvirt.driver [-] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Instance destroyed successfully.#033[00m
Oct  8 19:08:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-3dc6caed12c224c239b697cd06381493a291f904c3b4b3172b2f62f362bdce12-merged.mount: Deactivated successfully.
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.577 2 DEBUG nova.objects.instance [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'resources' on Instance uuid b66b330b-1cad-4dfb-a2f9-83201dc8ee32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.595 2 DEBUG nova.virt.libvirt.vif [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:07:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-602516393',display_name='tempest-TestNetworkBasicOps-server-602516393',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-602516393',id=3,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC1aTvBPkgf1VfNjvC4uuKKg+tISnXImijmvs2cGp+FtgeRrvxYh9lBxLRU9xSzH0Z6LaCabBaf6NwgK+eU8uEwumcvsX4qsd2EcbV6VjIknh+8LBbcMTdQeQSSFJx6qhQ==',key_name='tempest-TestNetworkBasicOps-1029193278',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:07:32Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-dwebwbaf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:07:32Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=b66b330b-1cad-4dfb-a2f9-83201dc8ee32,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0107be0e-1b4b-47dd-9422-a435ded0964c", "address": "fa:16:3e:d7:63:9d", "network": {"id": "15690acb-54cf-4081-a718-c14a1c0af6a8", "bridge": "br-int", "label": "tempest-network-smoke--977169033", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0107be0e-1b", "ovs_interfaceid": "0107be0e-1b4b-47dd-9422-a435ded0964c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.596 2 DEBUG nova.network.os_vif_util [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "0107be0e-1b4b-47dd-9422-a435ded0964c", "address": "fa:16:3e:d7:63:9d", "network": {"id": "15690acb-54cf-4081-a718-c14a1c0af6a8", "bridge": "br-int", "label": "tempest-network-smoke--977169033", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0107be0e-1b", "ovs_interfaceid": "0107be0e-1b4b-47dd-9422-a435ded0964c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.596 2 DEBUG nova.network.os_vif_util [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d7:63:9d,bridge_name='br-int',has_traffic_filtering=True,id=0107be0e-1b4b-47dd-9422-a435ded0964c,network=Network(15690acb-54cf-4081-a718-c14a1c0af6a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0107be0e-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.597 2 DEBUG os_vif [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d7:63:9d,bridge_name='br-int',has_traffic_filtering=True,id=0107be0e-1b4b-47dd-9422-a435ded0964c,network=Network(15690acb-54cf-4081-a718-c14a1c0af6a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0107be0e-1b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.599 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0107be0e-1b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.606 2 INFO os_vif [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d7:63:9d,bridge_name='br-int',has_traffic_filtering=True,id=0107be0e-1b4b-47dd-9422-a435ded0964c,network=Network(15690acb-54cf-4081-a718-c14a1c0af6a8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0107be0e-1b')#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.607 2 INFO nova.virt.libvirt.driver [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Deleting instance files /var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32_del#033[00m
Oct  8 19:08:04 compute-0 podman[146251]: 2025-10-08 19:08:04.60755249 +0000 UTC m=+0.130872531 container cleanup f1a0117811421542b95673dae027a361c998bc57c3bcbb56c41602c72d45d71c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.608 2 INFO nova.virt.libvirt.driver [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Deletion of /var/lib/nova/instances/b66b330b-1cad-4dfb-a2f9-83201dc8ee32_del complete#033[00m
Oct  8 19:08:04 compute-0 systemd[1]: libpod-conmon-f1a0117811421542b95673dae027a361c998bc57c3bcbb56c41602c72d45d71c.scope: Deactivated successfully.
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.662 2 INFO nova.compute.manager [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.662 2 DEBUG oslo.service.loopingcall [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.663 2 DEBUG nova.compute.manager [-] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.663 2 DEBUG nova.network.neutron [-] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 19:08:04 compute-0 podman[146298]: 2025-10-08 19:08:04.675035896 +0000 UTC m=+0.044570349 container remove f1a0117811421542b95673dae027a361c998bc57c3bcbb56c41602c72d45d71c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 19:08:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:04.682 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[8595acef-2323-4c41-a767-ef6219114b33]: (4, ('Wed Oct  8 07:08:04 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8 (f1a0117811421542b95673dae027a361c998bc57c3bcbb56c41602c72d45d71c)\nf1a0117811421542b95673dae027a361c998bc57c3bcbb56c41602c72d45d71c\nWed Oct  8 07:08:04 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8 (f1a0117811421542b95673dae027a361c998bc57c3bcbb56c41602c72d45d71c)\nf1a0117811421542b95673dae027a361c998bc57c3bcbb56c41602c72d45d71c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:08:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:04.683 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[1115fbbd-5af1-4285-ae5e-25479fbf8592]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:08:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:04.684 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap15690acb-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:04 compute-0 kernel: tap15690acb-50: left promiscuous mode
Oct  8 19:08:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:04.690 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[1abf3e45-c32a-499d-bd70-6dd0a1a409d4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.712 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:08:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:04.715 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[eb067c56-ca49-45c3-8382-ea3392983c35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:08:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:04.716 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[238d5399-05f5-47b9-8d0a-b58a19fa3228]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.717 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.717 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.735 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.735 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.736 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.736 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:08:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:04.741 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[33dd0ff9-ae89-48d0-9230-fee19e264ae7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 112111, 'reachable_time': 33419, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 146313, 'error': None, 'target': 'ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:08:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:04.743 28783 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-15690acb-54cf-4081-a718-c14a1c0af6a8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 19:08:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:04.743 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[52f196b6-fd66-4e57-994c-f46511ad677c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:08:04 compute-0 systemd[1]: run-netns-ovnmeta\x2d15690acb\x2d54cf\x2d4081\x2da718\x2dc14a1c0af6a8.mount: Deactivated successfully.
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.763 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.763 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.763 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.763 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.947 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.948 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6100MB free_disk=73.4237289428711GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.948 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:08:04 compute-0 nova_compute[117514]: 2025-10-08 19:08:04.949 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.013 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Instance b66b330b-1cad-4dfb-a2f9-83201dc8ee32 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.014 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.014 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.065 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.079 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.101 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.102 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.339 2 DEBUG nova.network.neutron [-] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.356 2 INFO nova.compute.manager [-] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Took 0.69 seconds to deallocate network for instance.#033[00m
Oct  8 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.397 2 DEBUG oslo_concurrency.lockutils [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.397 2 DEBUG oslo_concurrency.lockutils [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.453 2 DEBUG nova.compute.provider_tree [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.468 2 DEBUG nova.scheduler.client.report [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.488 2 DEBUG oslo_concurrency.lockutils [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.091s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.512 2 INFO nova.scheduler.client.report [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Deleted allocations for instance b66b330b-1cad-4dfb-a2f9-83201dc8ee32#033[00m
Oct  8 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.559 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.577 2 DEBUG oslo_concurrency.lockutils [None req-ed0b09e8-01cd-404b-978b-ec0387d5388f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.274s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.662 2 DEBUG nova.network.neutron [req-0d9e9642-a7dc-4aaa-87b5-79dd9358d42f req-495efbe4-c5c9-4f3e-80d5-5189ce5e2366 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Updated VIF entry in instance network info cache for port 0107be0e-1b4b-47dd-9422-a435ded0964c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.663 2 DEBUG nova.network.neutron [req-0d9e9642-a7dc-4aaa-87b5-79dd9358d42f req-495efbe4-c5c9-4f3e-80d5-5189ce5e2366 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Updating instance_info_cache with network_info: [{"id": "0107be0e-1b4b-47dd-9422-a435ded0964c", "address": "fa:16:3e:d7:63:9d", "network": {"id": "15690acb-54cf-4081-a718-c14a1c0af6a8", "bridge": "br-int", "label": "tempest-network-smoke--977169033", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0107be0e-1b", "ovs_interfaceid": "0107be0e-1b4b-47dd-9422-a435ded0964c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:08:05 compute-0 nova_compute[117514]: 2025-10-08 19:08:05.682 2 DEBUG oslo_concurrency.lockutils [req-0d9e9642-a7dc-4aaa-87b5-79dd9358d42f req-495efbe4-c5c9-4f3e-80d5-5189ce5e2366 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-b66b330b-1cad-4dfb-a2f9-83201dc8ee32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:08:06 compute-0 nova_compute[117514]: 2025-10-08 19:08:06.082 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:08:06 compute-0 nova_compute[117514]: 2025-10-08 19:08:06.083 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 19:08:06 compute-0 nova_compute[117514]: 2025-10-08 19:08:06.526 2 DEBUG nova.compute.manager [req-466249e3-5d03-4c5c-88bd-c4c5dffb3cec req-bf626d42-a0c6-4fc3-a323-16ffa38d7598 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received event network-vif-unplugged-0107be0e-1b4b-47dd-9422-a435ded0964c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:08:06 compute-0 nova_compute[117514]: 2025-10-08 19:08:06.527 2 DEBUG oslo_concurrency.lockutils [req-466249e3-5d03-4c5c-88bd-c4c5dffb3cec req-bf626d42-a0c6-4fc3-a323-16ffa38d7598 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:08:06 compute-0 nova_compute[117514]: 2025-10-08 19:08:06.528 2 DEBUG oslo_concurrency.lockutils [req-466249e3-5d03-4c5c-88bd-c4c5dffb3cec req-bf626d42-a0c6-4fc3-a323-16ffa38d7598 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:08:06 compute-0 nova_compute[117514]: 2025-10-08 19:08:06.528 2 DEBUG oslo_concurrency.lockutils [req-466249e3-5d03-4c5c-88bd-c4c5dffb3cec req-bf626d42-a0c6-4fc3-a323-16ffa38d7598 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:08:06 compute-0 nova_compute[117514]: 2025-10-08 19:08:06.528 2 DEBUG nova.compute.manager [req-466249e3-5d03-4c5c-88bd-c4c5dffb3cec req-bf626d42-a0c6-4fc3-a323-16ffa38d7598 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] No waiting events found dispatching network-vif-unplugged-0107be0e-1b4b-47dd-9422-a435ded0964c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:08:06 compute-0 nova_compute[117514]: 2025-10-08 19:08:06.529 2 WARNING nova.compute.manager [req-466249e3-5d03-4c5c-88bd-c4c5dffb3cec req-bf626d42-a0c6-4fc3-a323-16ffa38d7598 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received unexpected event network-vif-unplugged-0107be0e-1b4b-47dd-9422-a435ded0964c for instance with vm_state deleted and task_state None.#033[00m
Oct  8 19:08:06 compute-0 nova_compute[117514]: 2025-10-08 19:08:06.529 2 DEBUG nova.compute.manager [req-466249e3-5d03-4c5c-88bd-c4c5dffb3cec req-bf626d42-a0c6-4fc3-a323-16ffa38d7598 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received event network-vif-plugged-0107be0e-1b4b-47dd-9422-a435ded0964c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:08:06 compute-0 nova_compute[117514]: 2025-10-08 19:08:06.530 2 DEBUG oslo_concurrency.lockutils [req-466249e3-5d03-4c5c-88bd-c4c5dffb3cec req-bf626d42-a0c6-4fc3-a323-16ffa38d7598 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:08:06 compute-0 nova_compute[117514]: 2025-10-08 19:08:06.530 2 DEBUG oslo_concurrency.lockutils [req-466249e3-5d03-4c5c-88bd-c4c5dffb3cec req-bf626d42-a0c6-4fc3-a323-16ffa38d7598 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:08:06 compute-0 nova_compute[117514]: 2025-10-08 19:08:06.530 2 DEBUG oslo_concurrency.lockutils [req-466249e3-5d03-4c5c-88bd-c4c5dffb3cec req-bf626d42-a0c6-4fc3-a323-16ffa38d7598 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b66b330b-1cad-4dfb-a2f9-83201dc8ee32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:08:06 compute-0 nova_compute[117514]: 2025-10-08 19:08:06.531 2 DEBUG nova.compute.manager [req-466249e3-5d03-4c5c-88bd-c4c5dffb3cec req-bf626d42-a0c6-4fc3-a323-16ffa38d7598 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] No waiting events found dispatching network-vif-plugged-0107be0e-1b4b-47dd-9422-a435ded0964c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:08:06 compute-0 nova_compute[117514]: 2025-10-08 19:08:06.531 2 WARNING nova.compute.manager [req-466249e3-5d03-4c5c-88bd-c4c5dffb3cec req-bf626d42-a0c6-4fc3-a323-16ffa38d7598 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received unexpected event network-vif-plugged-0107be0e-1b4b-47dd-9422-a435ded0964c for instance with vm_state deleted and task_state None.#033[00m
Oct  8 19:08:06 compute-0 nova_compute[117514]: 2025-10-08 19:08:06.531 2 DEBUG nova.compute.manager [req-466249e3-5d03-4c5c-88bd-c4c5dffb3cec req-bf626d42-a0c6-4fc3-a323-16ffa38d7598 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Received event network-vif-deleted-0107be0e-1b4b-47dd-9422-a435ded0964c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:08:06 compute-0 nova_compute[117514]: 2025-10-08 19:08:06.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:08:07 compute-0 podman[146315]: 2025-10-08 19:08:07.645217755 +0000 UTC m=+0.061874670 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 19:08:07 compute-0 nova_compute[117514]: 2025-10-08 19:08:07.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:08:07 compute-0 nova_compute[117514]: 2025-10-08 19:08:07.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:08 compute-0 nova_compute[117514]: 2025-10-08 19:08:08.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.241 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.247 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.247 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.247 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:08:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:08:08.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:08:09 compute-0 nova_compute[117514]: 2025-10-08 19:08:09.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:10 compute-0 nova_compute[117514]: 2025-10-08 19:08:10.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:14 compute-0 nova_compute[117514]: 2025-10-08 19:08:14.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:15 compute-0 nova_compute[117514]: 2025-10-08 19:08:15.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:17 compute-0 podman[146340]: 2025-10-08 19:08:17.65097625 +0000 UTC m=+0.070733406 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  8 19:08:19 compute-0 nova_compute[117514]: 2025-10-08 19:08:19.573 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759950484.571772, b66b330b-1cad-4dfb-a2f9-83201dc8ee32 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:08:19 compute-0 nova_compute[117514]: 2025-10-08 19:08:19.573 2 INFO nova.compute.manager [-] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] VM Stopped (Lifecycle Event)#033[00m
Oct  8 19:08:19 compute-0 nova_compute[117514]: 2025-10-08 19:08:19.598 2 DEBUG nova.compute.manager [None req-a9d41b9d-dd59-4090-8238-125f25f63212 - - - - - -] [instance: b66b330b-1cad-4dfb-a2f9-83201dc8ee32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:08:19 compute-0 nova_compute[117514]: 2025-10-08 19:08:19.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:20 compute-0 nova_compute[117514]: 2025-10-08 19:08:20.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.349 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "b81092db-79a9-4570-9579-4e100364515a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.349 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "b81092db-79a9-4570-9579-4e100364515a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.364 2 DEBUG nova.compute.manager [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.437 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.438 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.450 2 DEBUG nova.virt.hardware [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.451 2 INFO nova.compute.claims [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Claim successful on node compute-0.ctlplane.example.com#033[00m
Oct  8 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.552 2 DEBUG nova.compute.provider_tree [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.565 2 DEBUG nova.scheduler.client.report [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.586 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.586 2 DEBUG nova.compute.manager [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.648 2 DEBUG nova.compute.manager [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.649 2 DEBUG nova.network.neutron [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.670 2 INFO nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.692 2 DEBUG nova.compute.manager [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.784 2 DEBUG nova.compute.manager [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.786 2 DEBUG nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.786 2 INFO nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Creating image(s)#033[00m
Oct  8 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.787 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "/var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.787 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.788 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.802 2 DEBUG oslo_concurrency.processutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.825 2 DEBUG nova.policy [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.868 2 DEBUG oslo_concurrency.processutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.869 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "008eb3078b811ee47058b7252a820910c35fc6df" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.870 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.881 2 DEBUG oslo_concurrency.processutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.936 2 DEBUG oslo_concurrency.processutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.937 2 DEBUG oslo_concurrency.processutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.985 2 DEBUG oslo_concurrency.processutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/disk 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.986 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:08:22 compute-0 nova_compute[117514]: 2025-10-08 19:08:22.986 2 DEBUG oslo_concurrency.processutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:08:23 compute-0 nova_compute[117514]: 2025-10-08 19:08:23.066 2 DEBUG oslo_concurrency.processutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:08:23 compute-0 nova_compute[117514]: 2025-10-08 19:08:23.067 2 DEBUG nova.virt.disk.api [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Checking if we can resize image /var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  8 19:08:23 compute-0 nova_compute[117514]: 2025-10-08 19:08:23.068 2 DEBUG oslo_concurrency.processutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:08:23 compute-0 nova_compute[117514]: 2025-10-08 19:08:23.125 2 DEBUG oslo_concurrency.processutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:08:23 compute-0 nova_compute[117514]: 2025-10-08 19:08:23.126 2 DEBUG nova.virt.disk.api [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Cannot resize image /var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  8 19:08:23 compute-0 nova_compute[117514]: 2025-10-08 19:08:23.127 2 DEBUG nova.objects.instance [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'migration_context' on Instance uuid b81092db-79a9-4570-9579-4e100364515a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:08:23 compute-0 nova_compute[117514]: 2025-10-08 19:08:23.142 2 DEBUG nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 19:08:23 compute-0 nova_compute[117514]: 2025-10-08 19:08:23.143 2 DEBUG nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Ensure instance console log exists: /var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 19:08:23 compute-0 nova_compute[117514]: 2025-10-08 19:08:23.144 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:08:23 compute-0 nova_compute[117514]: 2025-10-08 19:08:23.145 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:08:23 compute-0 nova_compute[117514]: 2025-10-08 19:08:23.145 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:08:23 compute-0 podman[146377]: 2025-10-08 19:08:23.674956617 +0000 UTC m=+0.084916961 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  8 19:08:23 compute-0 podman[146376]: 2025-10-08 19:08:23.686438452 +0000 UTC m=+0.096659154 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, maintainer=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, name=ubi9-minimal, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., config_id=edpm)
Oct  8 19:08:24 compute-0 nova_compute[117514]: 2025-10-08 19:08:24.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:25 compute-0 nova_compute[117514]: 2025-10-08 19:08:25.566 2 DEBUG nova.network.neutron [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Successfully created port: 4df96566-2548-47bc-bd48-095ff9ce5a25 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 19:08:25 compute-0 nova_compute[117514]: 2025-10-08 19:08:25.612 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:26 compute-0 nova_compute[117514]: 2025-10-08 19:08:26.113 2 DEBUG nova.network.neutron [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Successfully updated port: 4df96566-2548-47bc-bd48-095ff9ce5a25 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 19:08:26 compute-0 nova_compute[117514]: 2025-10-08 19:08:26.132 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "refresh_cache-b81092db-79a9-4570-9579-4e100364515a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:08:26 compute-0 nova_compute[117514]: 2025-10-08 19:08:26.133 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquired lock "refresh_cache-b81092db-79a9-4570-9579-4e100364515a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:08:26 compute-0 nova_compute[117514]: 2025-10-08 19:08:26.134 2 DEBUG nova.network.neutron [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 19:08:26 compute-0 nova_compute[117514]: 2025-10-08 19:08:26.199 2 DEBUG nova.compute.manager [req-8d578775-f424-4733-a45c-99b7475d10fa req-3b39912f-7708-4f89-9b82-95c2372b1b7d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Received event network-changed-4df96566-2548-47bc-bd48-095ff9ce5a25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:08:26 compute-0 nova_compute[117514]: 2025-10-08 19:08:26.200 2 DEBUG nova.compute.manager [req-8d578775-f424-4733-a45c-99b7475d10fa req-3b39912f-7708-4f89-9b82-95c2372b1b7d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Refreshing instance network info cache due to event network-changed-4df96566-2548-47bc-bd48-095ff9ce5a25. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 19:08:26 compute-0 nova_compute[117514]: 2025-10-08 19:08:26.200 2 DEBUG oslo_concurrency.lockutils [req-8d578775-f424-4733-a45c-99b7475d10fa req-3b39912f-7708-4f89-9b82-95c2372b1b7d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-b81092db-79a9-4570-9579-4e100364515a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:08:26 compute-0 nova_compute[117514]: 2025-10-08 19:08:26.255 2 DEBUG nova.network.neutron [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 19:08:27 compute-0 podman[146412]: 2025-10-08 19:08:27.667483669 +0000 UTC m=+0.072981863 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.084 2 DEBUG nova.network.neutron [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Updating instance_info_cache with network_info: [{"id": "4df96566-2548-47bc-bd48-095ff9ce5a25", "address": "fa:16:3e:f7:31:02", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4df96566-25", "ovs_interfaceid": "4df96566-2548-47bc-bd48-095ff9ce5a25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.102 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Releasing lock "refresh_cache-b81092db-79a9-4570-9579-4e100364515a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.102 2 DEBUG nova.compute.manager [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Instance network_info: |[{"id": "4df96566-2548-47bc-bd48-095ff9ce5a25", "address": "fa:16:3e:f7:31:02", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4df96566-25", "ovs_interfaceid": "4df96566-2548-47bc-bd48-095ff9ce5a25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.103 2 DEBUG oslo_concurrency.lockutils [req-8d578775-f424-4733-a45c-99b7475d10fa req-3b39912f-7708-4f89-9b82-95c2372b1b7d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-b81092db-79a9-4570-9579-4e100364515a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.103 2 DEBUG nova.network.neutron [req-8d578775-f424-4733-a45c-99b7475d10fa req-3b39912f-7708-4f89-9b82-95c2372b1b7d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Refreshing network info cache for port 4df96566-2548-47bc-bd48-095ff9ce5a25 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.109 2 DEBUG nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Start _get_guest_xml network_info=[{"id": "4df96566-2548-47bc-bd48-095ff9ce5a25", "address": "fa:16:3e:f7:31:02", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4df96566-25", "ovs_interfaceid": "4df96566-2548-47bc-bd48-095ff9ce5a25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'image_id': '23cfa426-7011-4566-992d-1c7af39f70dd'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.117 2 WARNING nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.127 2 DEBUG nova.virt.libvirt.host [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.128 2 DEBUG nova.virt.libvirt.host [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.133 2 DEBUG nova.virt.libvirt.host [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.134 2 DEBUG nova.virt.libvirt.host [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.134 2 DEBUG nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.135 2 DEBUG nova.virt.hardware [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T19:05:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e8a148fc-4419-4813-98ff-a17e2a95609e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.135 2 DEBUG nova.virt.hardware [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.135 2 DEBUG nova.virt.hardware [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.136 2 DEBUG nova.virt.hardware [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.136 2 DEBUG nova.virt.hardware [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.136 2 DEBUG nova.virt.hardware [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.136 2 DEBUG nova.virt.hardware [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.136 2 DEBUG nova.virt.hardware [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.137 2 DEBUG nova.virt.hardware [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.137 2 DEBUG nova.virt.hardware [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.137 2 DEBUG nova.virt.hardware [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.140 2 DEBUG nova.virt.libvirt.vif [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:08:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1923358122',display_name='tempest-TestNetworkBasicOps-server-1923358122',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1923358122',id=4,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLQLVJLI0B1DuDHRr0xZejVz519BcFo77SQm/iU8QOSD6bvHcTPIzjucvYocQDiXeDjzdepuMi6T99yqrAkyTWA86BuQoBq3ywvQZ7i+b1z4o3zuHDlJxNAK8zAsugXiSA==',key_name='tempest-TestNetworkBasicOps-993932891',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-bunw0mg3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:08:22Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=b81092db-79a9-4570-9579-4e100364515a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4df96566-2548-47bc-bd48-095ff9ce5a25", "address": "fa:16:3e:f7:31:02", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4df96566-25", "ovs_interfaceid": "4df96566-2548-47bc-bd48-095ff9ce5a25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.141 2 DEBUG nova.network.os_vif_util [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "4df96566-2548-47bc-bd48-095ff9ce5a25", "address": "fa:16:3e:f7:31:02", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4df96566-25", "ovs_interfaceid": "4df96566-2548-47bc-bd48-095ff9ce5a25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.141 2 DEBUG nova.network.os_vif_util [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f7:31:02,bridge_name='br-int',has_traffic_filtering=True,id=4df96566-2548-47bc-bd48-095ff9ce5a25,network=Network(820a3a2e-47e5-4f6d-88d6-281476a31fb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4df96566-25') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.142 2 DEBUG nova.objects.instance [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'pci_devices' on Instance uuid b81092db-79a9-4570-9579-4e100364515a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.185 2 DEBUG nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] End _get_guest_xml xml=<domain type="kvm">
Oct  8 19:08:28 compute-0 nova_compute[117514]:  <uuid>b81092db-79a9-4570-9579-4e100364515a</uuid>
Oct  8 19:08:28 compute-0 nova_compute[117514]:  <name>instance-00000004</name>
Oct  8 19:08:28 compute-0 nova_compute[117514]:  <memory>131072</memory>
Oct  8 19:08:28 compute-0 nova_compute[117514]:  <vcpu>1</vcpu>
Oct  8 19:08:28 compute-0 nova_compute[117514]:  <metadata>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 19:08:28 compute-0 nova_compute[117514]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:      <nova:name>tempest-TestNetworkBasicOps-server-1923358122</nova:name>
Oct  8 19:08:28 compute-0 nova_compute[117514]:      <nova:creationTime>2025-10-08 19:08:28</nova:creationTime>
Oct  8 19:08:28 compute-0 nova_compute[117514]:      <nova:flavor name="m1.nano">
Oct  8 19:08:28 compute-0 nova_compute[117514]:        <nova:memory>128</nova:memory>
Oct  8 19:08:28 compute-0 nova_compute[117514]:        <nova:disk>1</nova:disk>
Oct  8 19:08:28 compute-0 nova_compute[117514]:        <nova:swap>0</nova:swap>
Oct  8 19:08:28 compute-0 nova_compute[117514]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 19:08:28 compute-0 nova_compute[117514]:        <nova:vcpus>1</nova:vcpus>
Oct  8 19:08:28 compute-0 nova_compute[117514]:      </nova:flavor>
Oct  8 19:08:28 compute-0 nova_compute[117514]:      <nova:owner>
Oct  8 19:08:28 compute-0 nova_compute[117514]:        <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct  8 19:08:28 compute-0 nova_compute[117514]:        <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct  8 19:08:28 compute-0 nova_compute[117514]:      </nova:owner>
Oct  8 19:08:28 compute-0 nova_compute[117514]:      <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:      <nova:ports>
Oct  8 19:08:28 compute-0 nova_compute[117514]:        <nova:port uuid="4df96566-2548-47bc-bd48-095ff9ce5a25">
Oct  8 19:08:28 compute-0 nova_compute[117514]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:        </nova:port>
Oct  8 19:08:28 compute-0 nova_compute[117514]:      </nova:ports>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    </nova:instance>
Oct  8 19:08:28 compute-0 nova_compute[117514]:  </metadata>
Oct  8 19:08:28 compute-0 nova_compute[117514]:  <sysinfo type="smbios">
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <system>
Oct  8 19:08:28 compute-0 nova_compute[117514]:      <entry name="manufacturer">RDO</entry>
Oct  8 19:08:28 compute-0 nova_compute[117514]:      <entry name="product">OpenStack Compute</entry>
Oct  8 19:08:28 compute-0 nova_compute[117514]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 19:08:28 compute-0 nova_compute[117514]:      <entry name="serial">b81092db-79a9-4570-9579-4e100364515a</entry>
Oct  8 19:08:28 compute-0 nova_compute[117514]:      <entry name="uuid">b81092db-79a9-4570-9579-4e100364515a</entry>
Oct  8 19:08:28 compute-0 nova_compute[117514]:      <entry name="family">Virtual Machine</entry>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    </system>
Oct  8 19:08:28 compute-0 nova_compute[117514]:  </sysinfo>
Oct  8 19:08:28 compute-0 nova_compute[117514]:  <os>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <boot dev="hd"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <smbios mode="sysinfo"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:  </os>
Oct  8 19:08:28 compute-0 nova_compute[117514]:  <features>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <acpi/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <apic/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <vmcoreinfo/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:  </features>
Oct  8 19:08:28 compute-0 nova_compute[117514]:  <clock offset="utc">
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <timer name="hpet" present="no"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:  </clock>
Oct  8 19:08:28 compute-0 nova_compute[117514]:  <cpu mode="host-model" match="exact">
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:  </cpu>
Oct  8 19:08:28 compute-0 nova_compute[117514]:  <devices>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <disk type="file" device="disk">
Oct  8 19:08:28 compute-0 nova_compute[117514]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:      <source file="/var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/disk"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:      <target dev="vda" bus="virtio"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <disk type="file" device="cdrom">
Oct  8 19:08:28 compute-0 nova_compute[117514]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:      <source file="/var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/disk.config"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:      <target dev="sda" bus="sata"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <interface type="ethernet">
Oct  8 19:08:28 compute-0 nova_compute[117514]:      <mac address="fa:16:3e:f7:31:02"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:      <model type="virtio"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:      <mtu size="1442"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:      <target dev="tap4df96566-25"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    </interface>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <serial type="pty">
Oct  8 19:08:28 compute-0 nova_compute[117514]:      <log file="/var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/console.log" append="off"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    </serial>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <video>
Oct  8 19:08:28 compute-0 nova_compute[117514]:      <model type="virtio"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    </video>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <input type="tablet" bus="usb"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <rng model="virtio">
Oct  8 19:08:28 compute-0 nova_compute[117514]:      <backend model="random">/dev/urandom</backend>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    </rng>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <controller type="usb" index="0"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    <memballoon model="virtio">
Oct  8 19:08:28 compute-0 nova_compute[117514]:      <stats period="10"/>
Oct  8 19:08:28 compute-0 nova_compute[117514]:    </memballoon>
Oct  8 19:08:28 compute-0 nova_compute[117514]:  </devices>
Oct  8 19:08:28 compute-0 nova_compute[117514]: </domain>
Oct  8 19:08:28 compute-0 nova_compute[117514]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.186 2 DEBUG nova.compute.manager [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Preparing to wait for external event network-vif-plugged-4df96566-2548-47bc-bd48-095ff9ce5a25 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.186 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "b81092db-79a9-4570-9579-4e100364515a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.186 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "b81092db-79a9-4570-9579-4e100364515a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.187 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "b81092db-79a9-4570-9579-4e100364515a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.187 2 DEBUG nova.virt.libvirt.vif [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:08:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1923358122',display_name='tempest-TestNetworkBasicOps-server-1923358122',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1923358122',id=4,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLQLVJLI0B1DuDHRr0xZejVz519BcFo77SQm/iU8QOSD6bvHcTPIzjucvYocQDiXeDjzdepuMi6T99yqrAkyTWA86BuQoBq3ywvQZ7i+b1z4o3zuHDlJxNAK8zAsugXiSA==',key_name='tempest-TestNetworkBasicOps-993932891',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-bunw0mg3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:08:22Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=b81092db-79a9-4570-9579-4e100364515a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4df96566-2548-47bc-bd48-095ff9ce5a25", "address": "fa:16:3e:f7:31:02", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4df96566-25", "ovs_interfaceid": "4df96566-2548-47bc-bd48-095ff9ce5a25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.188 2 DEBUG nova.network.os_vif_util [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "4df96566-2548-47bc-bd48-095ff9ce5a25", "address": "fa:16:3e:f7:31:02", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4df96566-25", "ovs_interfaceid": "4df96566-2548-47bc-bd48-095ff9ce5a25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.188 2 DEBUG nova.network.os_vif_util [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f7:31:02,bridge_name='br-int',has_traffic_filtering=True,id=4df96566-2548-47bc-bd48-095ff9ce5a25,network=Network(820a3a2e-47e5-4f6d-88d6-281476a31fb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4df96566-25') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.189 2 DEBUG os_vif [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f7:31:02,bridge_name='br-int',has_traffic_filtering=True,id=4df96566-2548-47bc-bd48-095ff9ce5a25,network=Network(820a3a2e-47e5-4f6d-88d6-281476a31fb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4df96566-25') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.189 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.190 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.192 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4df96566-25, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.193 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4df96566-25, col_values=(('external_ids', {'iface-id': '4df96566-2548-47bc-bd48-095ff9ce5a25', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f7:31:02', 'vm-uuid': 'b81092db-79a9-4570-9579-4e100364515a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:28 compute-0 NetworkManager[1035]: <info>  [1759950508.1954] manager: (tap4df96566-25): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/43)
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.205 2 INFO os_vif [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f7:31:02,bridge_name='br-int',has_traffic_filtering=True,id=4df96566-2548-47bc-bd48-095ff9ce5a25,network=Network(820a3a2e-47e5-4f6d-88d6-281476a31fb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4df96566-25')#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.311 2 DEBUG nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.311 2 DEBUG nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.312 2 DEBUG nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No VIF found with MAC fa:16:3e:f7:31:02, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 19:08:28 compute-0 nova_compute[117514]: 2025-10-08 19:08:28.313 2 INFO nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Using config drive#033[00m
Oct  8 19:08:29 compute-0 nova_compute[117514]: 2025-10-08 19:08:29.274 2 INFO nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Creating config drive at /var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/disk.config#033[00m
Oct  8 19:08:29 compute-0 nova_compute[117514]: 2025-10-08 19:08:29.284 2 DEBUG oslo_concurrency.processutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpce7lpz0x execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:08:29 compute-0 nova_compute[117514]: 2025-10-08 19:08:29.424 2 DEBUG oslo_concurrency.processutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpce7lpz0x" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:08:29 compute-0 kernel: tap4df96566-25: entered promiscuous mode
Oct  8 19:08:29 compute-0 NetworkManager[1035]: <info>  [1759950509.5342] manager: (tap4df96566-25): new Tun device (/org/freedesktop/NetworkManager/Devices/44)
Oct  8 19:08:29 compute-0 nova_compute[117514]: 2025-10-08 19:08:29.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:29 compute-0 ovn_controller[19759]: 2025-10-08T19:08:29Z|00065|binding|INFO|Claiming lport 4df96566-2548-47bc-bd48-095ff9ce5a25 for this chassis.
Oct  8 19:08:29 compute-0 ovn_controller[19759]: 2025-10-08T19:08:29Z|00066|binding|INFO|4df96566-2548-47bc-bd48-095ff9ce5a25: Claiming fa:16:3e:f7:31:02 10.100.0.4
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.550 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:31:02 10.100.0.4'], port_security=['fa:16:3e:f7:31:02 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'b81092db-79a9-4570-9579-4e100364515a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-820a3a2e-47e5-4f6d-88d6-281476a31fb1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b3c14dd0-3cf2-41c1-9115-bc2ef0b741ba', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b8f7e04c-5c12-4776-b9f7-f4835ede26c3, chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=4df96566-2548-47bc-bd48-095ff9ce5a25) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.552 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 4df96566-2548-47bc-bd48-095ff9ce5a25 in datapath 820a3a2e-47e5-4f6d-88d6-281476a31fb1 bound to our chassis#033[00m
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.554 28643 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 820a3a2e-47e5-4f6d-88d6-281476a31fb1#033[00m
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.568 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[679f2536-32ff-4c7c-8956-b133be81e209]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.569 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap820a3a2e-41 in ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.571 144726 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap820a3a2e-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.571 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[acfedef4-7d07-4662-ac0b-f5e0ff1d190c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.572 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[2821b07c-348a-42bd-90a6-051cf9d75300]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.585 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[31596fec-84c1-4bb5-9775-8f1ce7a72c29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:08:29 compute-0 systemd-udevd[146493]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 19:08:29 compute-0 systemd-machined[77568]: New machine qemu-4-instance-00000004.
Oct  8 19:08:29 compute-0 podman[146450]: 2025-10-08 19:08:29.61581587 +0000 UTC m=+0.094616772 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.616 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[638c5d6c-8051-4157-8585-eb895b47bb16]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:08:29 compute-0 nova_compute[117514]: 2025-10-08 19:08:29.618 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:29 compute-0 systemd[1]: Started Virtual Machine qemu-4-instance-00000004.
Oct  8 19:08:29 compute-0 NetworkManager[1035]: <info>  [1759950509.6215] device (tap4df96566-25): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 19:08:29 compute-0 NetworkManager[1035]: <info>  [1759950509.6239] device (tap4df96566-25): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 19:08:29 compute-0 ovn_controller[19759]: 2025-10-08T19:08:29Z|00067|binding|INFO|Setting lport 4df96566-2548-47bc-bd48-095ff9ce5a25 ovn-installed in OVS
Oct  8 19:08:29 compute-0 ovn_controller[19759]: 2025-10-08T19:08:29Z|00068|binding|INFO|Setting lport 4df96566-2548-47bc-bd48-095ff9ce5a25 up in Southbound
Oct  8 19:08:29 compute-0 podman[146449]: 2025-10-08 19:08:29.62811415 +0000 UTC m=+0.102485919 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct  8 19:08:29 compute-0 nova_compute[117514]: 2025-10-08 19:08:29.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.654 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[1f0ba735-c1cc-47df-ab7b-6d6f40d4e39e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:08:29 compute-0 systemd-udevd[146499]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 19:08:29 compute-0 NetworkManager[1035]: <info>  [1759950509.6606] manager: (tap820a3a2e-40): new Veth device (/org/freedesktop/NetworkManager/Devices/45)
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.660 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[d6276589-fc95-4a01-988e-e1facce27afe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.700 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[e416765f-36c8-4a1a-955a-caa8efa6290e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.705 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[dc8881ca-cc76-45e0-a057-f489cb993986]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:08:29 compute-0 NetworkManager[1035]: <info>  [1759950509.7350] device (tap820a3a2e-40): carrier: link connected
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.740 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[48b2f32c-767a-43ee-9cdc-2bb25f1b479e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.763 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[6d47d620-ca0e-4a15-9517-e252d3611eaf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap820a3a2e-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:c1:bb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 118031, 'reachable_time': 41703, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 146528, 'error': None, 'target': 'ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.784 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[42bddc48-fc49-4b6a-8fb0-0efb0286b5b9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefc:c1bb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 118031, 'tstamp': 118031}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 146529, 'error': None, 'target': 'ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.804 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[94ecda1a-b815-424d-ba24-f9788212cb6e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap820a3a2e-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:c1:bb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 118031, 'reachable_time': 41703, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 146530, 'error': None, 'target': 'ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.845 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[9924e97f-5376-42e6-b147-cbbc50597c24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.916 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[772abeb5-c0fa-431b-b3b2-19b67a8b3272]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.918 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap820a3a2e-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.918 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.919 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap820a3a2e-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:08:29 compute-0 nova_compute[117514]: 2025-10-08 19:08:29.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:29 compute-0 kernel: tap820a3a2e-40: entered promiscuous mode
Oct  8 19:08:29 compute-0 NetworkManager[1035]: <info>  [1759950509.9230] manager: (tap820a3a2e-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.925 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap820a3a2e-40, col_values=(('external_ids', {'iface-id': '9e4e54fa-32ec-4ece-b34d-e4e72c958a54'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:08:29 compute-0 nova_compute[117514]: 2025-10-08 19:08:29.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:29 compute-0 nova_compute[117514]: 2025-10-08 19:08:29.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.928 28643 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/820a3a2e-47e5-4f6d-88d6-281476a31fb1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/820a3a2e-47e5-4f6d-88d6-281476a31fb1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.929 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[678df671-a2ea-4078-9210-19e1b73a7859]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.930 28643 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]: global
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]:    log         /dev/log local0 debug
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]:    log-tag     haproxy-metadata-proxy-820a3a2e-47e5-4f6d-88d6-281476a31fb1
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]:    user        root
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]:    group       root
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]:    maxconn     1024
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]:    pidfile     /var/lib/neutron/external/pids/820a3a2e-47e5-4f6d-88d6-281476a31fb1.pid.haproxy
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]:    daemon
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]: 
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]: defaults
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]:    log global
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]:    mode http
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]:    option httplog
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]:    option dontlognull
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]:    option http-server-close
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]:    option forwardfor
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]:    retries                 3
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]:    timeout http-request    30s
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]:    timeout connect         30s
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]:    timeout client          32s
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]:    timeout server          32s
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]:    timeout http-keep-alive 30s
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]: 
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]: 
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]: listen listener
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]:    bind 169.254.169.254:80
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]:    http-request add-header X-OVN-Network-ID 820a3a2e-47e5-4f6d-88d6-281476a31fb1
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 19:08:29 compute-0 ovn_controller[19759]: 2025-10-08T19:08:29Z|00069|binding|INFO|Releasing lport 9e4e54fa-32ec-4ece-b34d-e4e72c958a54 from this chassis (sb_readonly=0)
Oct  8 19:08:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:29.931 28643 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1', 'env', 'PROCESS_TAG=haproxy-820a3a2e-47e5-4f6d-88d6-281476a31fb1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/820a3a2e-47e5-4f6d-88d6-281476a31fb1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 19:08:29 compute-0 nova_compute[117514]: 2025-10-08 19:08:29.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.271 2 DEBUG nova.compute.manager [req-1601b8ce-7163-4bc3-8183-8ba97f510514 req-d4bdb2b1-1a74-45ce-9a06-8b154c98720a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Received event network-vif-plugged-4df96566-2548-47bc-bd48-095ff9ce5a25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.272 2 DEBUG oslo_concurrency.lockutils [req-1601b8ce-7163-4bc3-8183-8ba97f510514 req-d4bdb2b1-1a74-45ce-9a06-8b154c98720a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "b81092db-79a9-4570-9579-4e100364515a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.272 2 DEBUG oslo_concurrency.lockutils [req-1601b8ce-7163-4bc3-8183-8ba97f510514 req-d4bdb2b1-1a74-45ce-9a06-8b154c98720a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b81092db-79a9-4570-9579-4e100364515a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.273 2 DEBUG oslo_concurrency.lockutils [req-1601b8ce-7163-4bc3-8183-8ba97f510514 req-d4bdb2b1-1a74-45ce-9a06-8b154c98720a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b81092db-79a9-4570-9579-4e100364515a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.273 2 DEBUG nova.compute.manager [req-1601b8ce-7163-4bc3-8183-8ba97f510514 req-d4bdb2b1-1a74-45ce-9a06-8b154c98720a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Processing event network-vif-plugged-4df96566-2548-47bc-bd48-095ff9ce5a25 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.336 2 DEBUG nova.network.neutron [req-8d578775-f424-4733-a45c-99b7475d10fa req-3b39912f-7708-4f89-9b82-95c2372b1b7d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Updated VIF entry in instance network info cache for port 4df96566-2548-47bc-bd48-095ff9ce5a25. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.337 2 DEBUG nova.network.neutron [req-8d578775-f424-4733-a45c-99b7475d10fa req-3b39912f-7708-4f89-9b82-95c2372b1b7d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Updating instance_info_cache with network_info: [{"id": "4df96566-2548-47bc-bd48-095ff9ce5a25", "address": "fa:16:3e:f7:31:02", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4df96566-25", "ovs_interfaceid": "4df96566-2548-47bc-bd48-095ff9ce5a25", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.353 2 DEBUG oslo_concurrency.lockutils [req-8d578775-f424-4733-a45c-99b7475d10fa req-3b39912f-7708-4f89-9b82-95c2372b1b7d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-b81092db-79a9-4570-9579-4e100364515a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:08:30 compute-0 podman[146569]: 2025-10-08 19:08:30.393396691 +0000 UTC m=+0.110553571 container create fb07793201da9ab1609a4f4565bfd293a68536af438cb3b77cd79566b9425f07 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 19:08:30 compute-0 podman[146569]: 2025-10-08 19:08:30.307542793 +0000 UTC m=+0.024699683 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct  8 19:08:30 compute-0 systemd[1]: Started libpod-conmon-fb07793201da9ab1609a4f4565bfd293a68536af438cb3b77cd79566b9425f07.scope.
Oct  8 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.464 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950510.4635644, b81092db-79a9-4570-9579-4e100364515a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.466 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b81092db-79a9-4570-9579-4e100364515a] VM Started (Lifecycle Event)#033[00m
Oct  8 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.469 2 DEBUG nova.compute.manager [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.479 2 DEBUG nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 19:08:30 compute-0 systemd[1]: Started libcrun container.
Oct  8 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.485 2 INFO nova.virt.libvirt.driver [-] [instance: b81092db-79a9-4570-9579-4e100364515a] Instance spawned successfully.#033[00m
Oct  8 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.485 2 DEBUG nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.489 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b81092db-79a9-4570-9579-4e100364515a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:08:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58ea3c57cf88dfce89603da495e87146b2bb91b27139ea83f58571fbcf3d370c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.493 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b81092db-79a9-4570-9579-4e100364515a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.506 2 DEBUG nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.507 2 DEBUG nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.508 2 DEBUG nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.508 2 DEBUG nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.509 2 DEBUG nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.510 2 DEBUG nova.virt.libvirt.driver [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.516 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b81092db-79a9-4570-9579-4e100364515a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.517 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950510.463702, b81092db-79a9-4570-9579-4e100364515a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.517 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b81092db-79a9-4570-9579-4e100364515a] VM Paused (Lifecycle Event)#033[00m
Oct  8 19:08:30 compute-0 podman[146582]: 2025-10-08 19:08:30.539286562 +0000 UTC m=+0.103328404 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct  8 19:08:30 compute-0 podman[146569]: 2025-10-08 19:08:30.53954607 +0000 UTC m=+0.256703000 container init fb07793201da9ab1609a4f4565bfd293a68536af438cb3b77cd79566b9425f07 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 19:08:30 compute-0 podman[146569]: 2025-10-08 19:08:30.545666844 +0000 UTC m=+0.262823714 container start fb07793201da9ab1609a4f4565bfd293a68536af438cb3b77cd79566b9425f07 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct  8 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.556 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b81092db-79a9-4570-9579-4e100364515a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.560 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950510.4742796, b81092db-79a9-4570-9579-4e100364515a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.560 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b81092db-79a9-4570-9579-4e100364515a] VM Resumed (Lifecycle Event)#033[00m
Oct  8 19:08:30 compute-0 neutron-haproxy-ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1[146594]: [NOTICE]   (146614) : New worker (146616) forked
Oct  8 19:08:30 compute-0 neutron-haproxy-ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1[146594]: [NOTICE]   (146614) : Loading success.
Oct  8 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.578 2 INFO nova.compute.manager [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Took 7.79 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.579 2 DEBUG nova.compute.manager [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.580 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b81092db-79a9-4570-9579-4e100364515a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.585 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b81092db-79a9-4570-9579-4e100364515a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.616 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: b81092db-79a9-4570-9579-4e100364515a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.634 2 INFO nova.compute.manager [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Took 8.23 seconds to build instance.#033[00m
Oct  8 19:08:30 compute-0 nova_compute[117514]: 2025-10-08 19:08:30.648 2 DEBUG oslo_concurrency.lockutils [None req-06570891-2d5b-4324-ab5e-aff43b51ff35 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "b81092db-79a9-4570-9579-4e100364515a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.299s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:08:32 compute-0 nova_compute[117514]: 2025-10-08 19:08:32.385 2 DEBUG nova.compute.manager [req-ea8b809c-a9ff-41ae-b67d-a79ae3eb998c req-fb490f23-3d26-4541-ae2e-7704eca0c9aa bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Received event network-vif-plugged-4df96566-2548-47bc-bd48-095ff9ce5a25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:08:32 compute-0 nova_compute[117514]: 2025-10-08 19:08:32.386 2 DEBUG oslo_concurrency.lockutils [req-ea8b809c-a9ff-41ae-b67d-a79ae3eb998c req-fb490f23-3d26-4541-ae2e-7704eca0c9aa bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "b81092db-79a9-4570-9579-4e100364515a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:08:32 compute-0 nova_compute[117514]: 2025-10-08 19:08:32.386 2 DEBUG oslo_concurrency.lockutils [req-ea8b809c-a9ff-41ae-b67d-a79ae3eb998c req-fb490f23-3d26-4541-ae2e-7704eca0c9aa bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b81092db-79a9-4570-9579-4e100364515a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:08:32 compute-0 nova_compute[117514]: 2025-10-08 19:08:32.387 2 DEBUG oslo_concurrency.lockutils [req-ea8b809c-a9ff-41ae-b67d-a79ae3eb998c req-fb490f23-3d26-4541-ae2e-7704eca0c9aa bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b81092db-79a9-4570-9579-4e100364515a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:08:32 compute-0 nova_compute[117514]: 2025-10-08 19:08:32.387 2 DEBUG nova.compute.manager [req-ea8b809c-a9ff-41ae-b67d-a79ae3eb998c req-fb490f23-3d26-4541-ae2e-7704eca0c9aa bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] No waiting events found dispatching network-vif-plugged-4df96566-2548-47bc-bd48-095ff9ce5a25 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:08:32 compute-0 nova_compute[117514]: 2025-10-08 19:08:32.388 2 WARNING nova.compute.manager [req-ea8b809c-a9ff-41ae-b67d-a79ae3eb998c req-fb490f23-3d26-4541-ae2e-7704eca0c9aa bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Received unexpected event network-vif-plugged-4df96566-2548-47bc-bd48-095ff9ce5a25 for instance with vm_state active and task_state None.#033[00m
Oct  8 19:08:33 compute-0 nova_compute[117514]: 2025-10-08 19:08:33.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:34 compute-0 ovn_controller[19759]: 2025-10-08T19:08:34Z|00070|binding|INFO|Releasing lport 9e4e54fa-32ec-4ece-b34d-e4e72c958a54 from this chassis (sb_readonly=0)
Oct  8 19:08:34 compute-0 nova_compute[117514]: 2025-10-08 19:08:34.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:34 compute-0 NetworkManager[1035]: <info>  [1759950514.3565] manager: (patch-br-int-to-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Oct  8 19:08:34 compute-0 NetworkManager[1035]: <info>  [1759950514.3582] manager: (patch-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Oct  8 19:08:34 compute-0 ovn_controller[19759]: 2025-10-08T19:08:34Z|00071|binding|INFO|Releasing lport 9e4e54fa-32ec-4ece-b34d-e4e72c958a54 from this chassis (sb_readonly=0)
Oct  8 19:08:34 compute-0 nova_compute[117514]: 2025-10-08 19:08:34.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:34 compute-0 nova_compute[117514]: 2025-10-08 19:08:34.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:34 compute-0 nova_compute[117514]: 2025-10-08 19:08:34.784 2 DEBUG nova.compute.manager [req-d2c83d13-68f2-4bb8-aae7-50db4a4e3a0a req-15308620-8bfe-4daf-8fe2-c133abc856cb bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Received event network-changed-4df96566-2548-47bc-bd48-095ff9ce5a25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:08:34 compute-0 nova_compute[117514]: 2025-10-08 19:08:34.785 2 DEBUG nova.compute.manager [req-d2c83d13-68f2-4bb8-aae7-50db4a4e3a0a req-15308620-8bfe-4daf-8fe2-c133abc856cb bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Refreshing instance network info cache due to event network-changed-4df96566-2548-47bc-bd48-095ff9ce5a25. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 19:08:34 compute-0 nova_compute[117514]: 2025-10-08 19:08:34.785 2 DEBUG oslo_concurrency.lockutils [req-d2c83d13-68f2-4bb8-aae7-50db4a4e3a0a req-15308620-8bfe-4daf-8fe2-c133abc856cb bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-b81092db-79a9-4570-9579-4e100364515a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:08:34 compute-0 nova_compute[117514]: 2025-10-08 19:08:34.786 2 DEBUG oslo_concurrency.lockutils [req-d2c83d13-68f2-4bb8-aae7-50db4a4e3a0a req-15308620-8bfe-4daf-8fe2-c133abc856cb bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-b81092db-79a9-4570-9579-4e100364515a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:08:34 compute-0 nova_compute[117514]: 2025-10-08 19:08:34.786 2 DEBUG nova.network.neutron [req-d2c83d13-68f2-4bb8-aae7-50db4a4e3a0a req-15308620-8bfe-4daf-8fe2-c133abc856cb bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Refreshing network info cache for port 4df96566-2548-47bc-bd48-095ff9ce5a25 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 19:08:35 compute-0 nova_compute[117514]: 2025-10-08 19:08:35.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:36 compute-0 nova_compute[117514]: 2025-10-08 19:08:36.088 2 DEBUG nova.network.neutron [req-d2c83d13-68f2-4bb8-aae7-50db4a4e3a0a req-15308620-8bfe-4daf-8fe2-c133abc856cb bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Updated VIF entry in instance network info cache for port 4df96566-2548-47bc-bd48-095ff9ce5a25. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 19:08:36 compute-0 nova_compute[117514]: 2025-10-08 19:08:36.089 2 DEBUG nova.network.neutron [req-d2c83d13-68f2-4bb8-aae7-50db4a4e3a0a req-15308620-8bfe-4daf-8fe2-c133abc856cb bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Updating instance_info_cache with network_info: [{"id": "4df96566-2548-47bc-bd48-095ff9ce5a25", "address": "fa:16:3e:f7:31:02", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4df96566-25", "ovs_interfaceid": "4df96566-2548-47bc-bd48-095ff9ce5a25", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:08:36 compute-0 nova_compute[117514]: 2025-10-08 19:08:36.108 2 DEBUG oslo_concurrency.lockutils [req-d2c83d13-68f2-4bb8-aae7-50db4a4e3a0a req-15308620-8bfe-4daf-8fe2-c133abc856cb bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-b81092db-79a9-4570-9579-4e100364515a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:08:38 compute-0 nova_compute[117514]: 2025-10-08 19:08:38.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:38 compute-0 podman[146626]: 2025-10-08 19:08:38.648777806 +0000 UTC m=+0.068493362 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 19:08:40 compute-0 nova_compute[117514]: 2025-10-08 19:08:40.619 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:41 compute-0 ovn_controller[19759]: 2025-10-08T19:08:41Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f7:31:02 10.100.0.4
Oct  8 19:08:41 compute-0 ovn_controller[19759]: 2025-10-08T19:08:41Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f7:31:02 10.100.0.4
Oct  8 19:08:43 compute-0 nova_compute[117514]: 2025-10-08 19:08:43.224 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:44.229 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:08:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:44.230 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:08:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:08:44.232 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:08:45 compute-0 nova_compute[117514]: 2025-10-08 19:08:45.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:47 compute-0 nova_compute[117514]: 2025-10-08 19:08:47.128 2 INFO nova.compute.manager [None req-8a926bd0-a0c3-4ef6-99b9-b743398bc5c0 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Get console output#033[00m
Oct  8 19:08:47 compute-0 nova_compute[117514]: 2025-10-08 19:08:47.134 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 19:08:48 compute-0 nova_compute[117514]: 2025-10-08 19:08:48.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:48 compute-0 podman[146664]: 2025-10-08 19:08:48.691261744 +0000 UTC m=+0.099073297 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute)
Oct  8 19:08:50 compute-0 nova_compute[117514]: 2025-10-08 19:08:50.399 2 DEBUG nova.compute.manager [req-4c9adc67-85d1-4b2e-8b2c-bd8ad6742b5d req-e9c4cd01-e14a-4c03-8053-919c35fee654 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Received event network-changed-4df96566-2548-47bc-bd48-095ff9ce5a25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:08:50 compute-0 nova_compute[117514]: 2025-10-08 19:08:50.400 2 DEBUG nova.compute.manager [req-4c9adc67-85d1-4b2e-8b2c-bd8ad6742b5d req-e9c4cd01-e14a-4c03-8053-919c35fee654 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Refreshing instance network info cache due to event network-changed-4df96566-2548-47bc-bd48-095ff9ce5a25. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 19:08:50 compute-0 nova_compute[117514]: 2025-10-08 19:08:50.400 2 DEBUG oslo_concurrency.lockutils [req-4c9adc67-85d1-4b2e-8b2c-bd8ad6742b5d req-e9c4cd01-e14a-4c03-8053-919c35fee654 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-b81092db-79a9-4570-9579-4e100364515a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:08:50 compute-0 nova_compute[117514]: 2025-10-08 19:08:50.401 2 DEBUG oslo_concurrency.lockutils [req-4c9adc67-85d1-4b2e-8b2c-bd8ad6742b5d req-e9c4cd01-e14a-4c03-8053-919c35fee654 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-b81092db-79a9-4570-9579-4e100364515a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:08:50 compute-0 nova_compute[117514]: 2025-10-08 19:08:50.401 2 DEBUG nova.network.neutron [req-4c9adc67-85d1-4b2e-8b2c-bd8ad6742b5d req-e9c4cd01-e14a-4c03-8053-919c35fee654 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Refreshing network info cache for port 4df96566-2548-47bc-bd48-095ff9ce5a25 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 19:08:50 compute-0 nova_compute[117514]: 2025-10-08 19:08:50.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:52 compute-0 nova_compute[117514]: 2025-10-08 19:08:52.096 2 DEBUG nova.network.neutron [req-4c9adc67-85d1-4b2e-8b2c-bd8ad6742b5d req-e9c4cd01-e14a-4c03-8053-919c35fee654 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Updated VIF entry in instance network info cache for port 4df96566-2548-47bc-bd48-095ff9ce5a25. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 19:08:52 compute-0 nova_compute[117514]: 2025-10-08 19:08:52.096 2 DEBUG nova.network.neutron [req-4c9adc67-85d1-4b2e-8b2c-bd8ad6742b5d req-e9c4cd01-e14a-4c03-8053-919c35fee654 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Updating instance_info_cache with network_info: [{"id": "4df96566-2548-47bc-bd48-095ff9ce5a25", "address": "fa:16:3e:f7:31:02", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4df96566-25", "ovs_interfaceid": "4df96566-2548-47bc-bd48-095ff9ce5a25", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:08:52 compute-0 nova_compute[117514]: 2025-10-08 19:08:52.121 2 DEBUG oslo_concurrency.lockutils [req-4c9adc67-85d1-4b2e-8b2c-bd8ad6742b5d req-e9c4cd01-e14a-4c03-8053-919c35fee654 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-b81092db-79a9-4570-9579-4e100364515a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:08:53 compute-0 nova_compute[117514]: 2025-10-08 19:08:53.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:08:54 compute-0 podman[146685]: 2025-10-08 19:08:54.656773657 +0000 UTC m=+0.070528253 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, release=1755695350, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, config_id=edpm, container_name=openstack_network_exporter)
Oct  8 19:08:54 compute-0 podman[146686]: 2025-10-08 19:08:54.663807282 +0000 UTC m=+0.071257903 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  8 19:08:55 compute-0 nova_compute[117514]: 2025-10-08 19:08:55.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.129 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.130 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.152 2 DEBUG nova.compute.manager [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct  8 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.231 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.232 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.240 2 DEBUG nova.virt.hardware [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct  8 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.240 2 INFO nova.compute.claims [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Claim successful on node compute-0.ctlplane.example.com
Oct  8 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.391 2 DEBUG nova.compute.provider_tree [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  8 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.411 2 DEBUG nova.scheduler.client.report [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  8 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.447 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.215s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.448 2 DEBUG nova.compute.manager [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  8 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.546 2 DEBUG nova.compute.manager [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct  8 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.547 2 DEBUG nova.network.neutron [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  8 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.570 2 INFO nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  8 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.593 2 DEBUG nova.compute.manager [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct  8 19:08:58 compute-0 podman[146725]: 2025-10-08 19:08:58.66721418 +0000 UTC m=+0.079004200 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.713 2 DEBUG nova.compute.manager [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  8 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.715 2 DEBUG nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  8 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.716 2 INFO nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Creating image(s)
Oct  8 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.717 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "/var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.717 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.718 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.743 2 DEBUG oslo_concurrency.processutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.834 2 DEBUG oslo_concurrency.processutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.835 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "008eb3078b811ee47058b7252a820910c35fc6df" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.836 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.859 2 DEBUG oslo_concurrency.processutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.941 2 DEBUG oslo_concurrency.processutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 19:08:58 compute-0 nova_compute[117514]: 2025-10-08 19:08:58.942 2 DEBUG oslo_concurrency.processutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 19:08:59 compute-0 nova_compute[117514]: 2025-10-08 19:08:59.012 2 DEBUG oslo_concurrency.processutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/disk 1073741824" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 19:08:59 compute-0 nova_compute[117514]: 2025-10-08 19:08:59.013 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 19:08:59 compute-0 nova_compute[117514]: 2025-10-08 19:08:59.014 2 DEBUG oslo_concurrency.processutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 19:08:59 compute-0 nova_compute[117514]: 2025-10-08 19:08:59.102 2 DEBUG nova.policy [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  8 19:08:59 compute-0 nova_compute[117514]: 2025-10-08 19:08:59.105 2 DEBUG oslo_concurrency.processutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 19:08:59 compute-0 nova_compute[117514]: 2025-10-08 19:08:59.106 2 DEBUG nova.virt.disk.api [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Checking if we can resize image /var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct  8 19:08:59 compute-0 nova_compute[117514]: 2025-10-08 19:08:59.106 2 DEBUG oslo_concurrency.processutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  8 19:08:59 compute-0 nova_compute[117514]: 2025-10-08 19:08:59.167 2 DEBUG oslo_concurrency.processutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  8 19:08:59 compute-0 nova_compute[117514]: 2025-10-08 19:08:59.169 2 DEBUG nova.virt.disk.api [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Cannot resize image /var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct  8 19:08:59 compute-0 nova_compute[117514]: 2025-10-08 19:08:59.169 2 DEBUG nova.objects.instance [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'migration_context' on Instance uuid 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  8 19:08:59 compute-0 nova_compute[117514]: 2025-10-08 19:08:59.184 2 DEBUG nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  8 19:08:59 compute-0 nova_compute[117514]: 2025-10-08 19:08:59.184 2 DEBUG nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Ensure instance console log exists: /var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  8 19:08:59 compute-0 nova_compute[117514]: 2025-10-08 19:08:59.185 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 19:08:59 compute-0 nova_compute[117514]: 2025-10-08 19:08:59.185 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 19:08:59 compute-0 nova_compute[117514]: 2025-10-08 19:08:59.185 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 19:09:00 compute-0 nova_compute[117514]: 2025-10-08 19:09:00.200 2 DEBUG nova.network.neutron [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Successfully created port: a70af23b-d9f3-4d3e-96da-692ae05ba88a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  8 19:09:00 compute-0 podman[146766]: 2025-10-08 19:09:00.641607535 +0000 UTC m=+0.059965964 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 19:09:00 compute-0 podman[146764]: 2025-10-08 19:09:00.646407525 +0000 UTC m=+0.065304860 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team)
Oct  8 19:09:00 compute-0 podman[146765]: 2025-10-08 19:09:00.671190409 +0000 UTC m=+0.095466991 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct  8 19:09:00 compute-0 nova_compute[117514]: 2025-10-08 19:09:00.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:09:01 compute-0 nova_compute[117514]: 2025-10-08 19:09:01.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:09:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:01.105 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:75:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '5e:14:dd:63:55:2a'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  8 19:09:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:01.106 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct  8 19:09:01 compute-0 nova_compute[117514]: 2025-10-08 19:09:01.317 2 DEBUG nova.network.neutron [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Successfully updated port: a70af23b-d9f3-4d3e-96da-692ae05ba88a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct  8 19:09:01 compute-0 nova_compute[117514]: 2025-10-08 19:09:01.335 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "refresh_cache-5f1c7c12-d16a-4158-9af6-e40d7ad01f2e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  8 19:09:01 compute-0 nova_compute[117514]: 2025-10-08 19:09:01.335 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquired lock "refresh_cache-5f1c7c12-d16a-4158-9af6-e40d7ad01f2e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  8 19:09:01 compute-0 nova_compute[117514]: 2025-10-08 19:09:01.335 2 DEBUG nova.network.neutron [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  8 19:09:01 compute-0 nova_compute[117514]: 2025-10-08 19:09:01.405 2 DEBUG nova.compute.manager [req-01ada483-92b7-4585-8c70-768b45246a77 req-3c44a16d-ae04-4a35-9db1-7126826df1b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Received event network-changed-a70af23b-d9f3-4d3e-96da-692ae05ba88a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  8 19:09:01 compute-0 nova_compute[117514]: 2025-10-08 19:09:01.406 2 DEBUG nova.compute.manager [req-01ada483-92b7-4585-8c70-768b45246a77 req-3c44a16d-ae04-4a35-9db1-7126826df1b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Refreshing instance network info cache due to event network-changed-a70af23b-d9f3-4d3e-96da-692ae05ba88a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  8 19:09:01 compute-0 nova_compute[117514]: 2025-10-08 19:09:01.406 2 DEBUG oslo_concurrency.lockutils [req-01ada483-92b7-4585-8c70-768b45246a77 req-3c44a16d-ae04-4a35-9db1-7126826df1b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-5f1c7c12-d16a-4158-9af6-e40d7ad01f2e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  8 19:09:01 compute-0 nova_compute[117514]: 2025-10-08 19:09:01.470 2 DEBUG nova.network.neutron [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.555 2 DEBUG nova.network.neutron [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Updating instance_info_cache with network_info: [{"id": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "address": "fa:16:3e:1c:aa:70", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa70af23b-d9", "ovs_interfaceid": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.572 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Releasing lock "refresh_cache-5f1c7c12-d16a-4158-9af6-e40d7ad01f2e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.573 2 DEBUG nova.compute.manager [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Instance network_info: |[{"id": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "address": "fa:16:3e:1c:aa:70", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa70af23b-d9", "ovs_interfaceid": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.574 2 DEBUG oslo_concurrency.lockutils [req-01ada483-92b7-4585-8c70-768b45246a77 req-3c44a16d-ae04-4a35-9db1-7126826df1b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-5f1c7c12-d16a-4158-9af6-e40d7ad01f2e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.574 2 DEBUG nova.network.neutron [req-01ada483-92b7-4585-8c70-768b45246a77 req-3c44a16d-ae04-4a35-9db1-7126826df1b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Refreshing network info cache for port a70af23b-d9f3-4d3e-96da-692ae05ba88a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.579 2 DEBUG nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Start _get_guest_xml network_info=[{"id": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "address": "fa:16:3e:1c:aa:70", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa70af23b-d9", "ovs_interfaceid": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'image_id': '23cfa426-7011-4566-992d-1c7af39f70dd'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.586 2 WARNING nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.592 2 DEBUG nova.virt.libvirt.host [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.593 2 DEBUG nova.virt.libvirt.host [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.598 2 DEBUG nova.virt.libvirt.host [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.598 2 DEBUG nova.virt.libvirt.host [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.599 2 DEBUG nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.600 2 DEBUG nova.virt.hardware [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T19:05:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e8a148fc-4419-4813-98ff-a17e2a95609e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.600 2 DEBUG nova.virt.hardware [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.601 2 DEBUG nova.virt.hardware [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.601 2 DEBUG nova.virt.hardware [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.602 2 DEBUG nova.virt.hardware [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.602 2 DEBUG nova.virt.hardware [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.602 2 DEBUG nova.virt.hardware [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.603 2 DEBUG nova.virt.hardware [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.603 2 DEBUG nova.virt.hardware [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.604 2 DEBUG nova.virt.hardware [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.604 2 DEBUG nova.virt.hardware [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.610 2 DEBUG nova.virt.libvirt.vif [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:08:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2036009954',display_name='tempest-TestNetworkBasicOps-server-2036009954',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2036009954',id=5,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIaQYsS3p/kqkwhPUSySAXSbxOdURRwycGYz8zG+mSEb7vM+V8TI/DnRVmOc+q/Hcp4ljBTmVN8Dn0Fwxkk8IhqlYVJKZ25JiPY8aDaNHw2HT5FEQjUWsRu8yiFEP7RRtA==',key_name='tempest-TestNetworkBasicOps-1573009253',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-4zycm9c0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:08:58Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=5f1c7c12-d16a-4158-9af6-e40d7ad01f2e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "address": "fa:16:3e:1c:aa:70", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa70af23b-d9", "ovs_interfaceid": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.610 2 DEBUG nova.network.os_vif_util [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "address": "fa:16:3e:1c:aa:70", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa70af23b-d9", "ovs_interfaceid": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.611 2 DEBUG nova.network.os_vif_util [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1c:aa:70,bridge_name='br-int',has_traffic_filtering=True,id=a70af23b-d9f3-4d3e-96da-692ae05ba88a,network=Network(820a3a2e-47e5-4f6d-88d6-281476a31fb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa70af23b-d9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.613 2 DEBUG nova.objects.instance [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.628 2 DEBUG nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] End _get_guest_xml xml=<domain type="kvm">
Oct  8 19:09:02 compute-0 nova_compute[117514]:  <uuid>5f1c7c12-d16a-4158-9af6-e40d7ad01f2e</uuid>
Oct  8 19:09:02 compute-0 nova_compute[117514]:  <name>instance-00000005</name>
Oct  8 19:09:02 compute-0 nova_compute[117514]:  <memory>131072</memory>
Oct  8 19:09:02 compute-0 nova_compute[117514]:  <vcpu>1</vcpu>
Oct  8 19:09:02 compute-0 nova_compute[117514]:  <metadata>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 19:09:02 compute-0 nova_compute[117514]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:      <nova:name>tempest-TestNetworkBasicOps-server-2036009954</nova:name>
Oct  8 19:09:02 compute-0 nova_compute[117514]:      <nova:creationTime>2025-10-08 19:09:02</nova:creationTime>
Oct  8 19:09:02 compute-0 nova_compute[117514]:      <nova:flavor name="m1.nano">
Oct  8 19:09:02 compute-0 nova_compute[117514]:        <nova:memory>128</nova:memory>
Oct  8 19:09:02 compute-0 nova_compute[117514]:        <nova:disk>1</nova:disk>
Oct  8 19:09:02 compute-0 nova_compute[117514]:        <nova:swap>0</nova:swap>
Oct  8 19:09:02 compute-0 nova_compute[117514]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 19:09:02 compute-0 nova_compute[117514]:        <nova:vcpus>1</nova:vcpus>
Oct  8 19:09:02 compute-0 nova_compute[117514]:      </nova:flavor>
Oct  8 19:09:02 compute-0 nova_compute[117514]:      <nova:owner>
Oct  8 19:09:02 compute-0 nova_compute[117514]:        <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct  8 19:09:02 compute-0 nova_compute[117514]:        <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct  8 19:09:02 compute-0 nova_compute[117514]:      </nova:owner>
Oct  8 19:09:02 compute-0 nova_compute[117514]:      <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:      <nova:ports>
Oct  8 19:09:02 compute-0 nova_compute[117514]:        <nova:port uuid="a70af23b-d9f3-4d3e-96da-692ae05ba88a">
Oct  8 19:09:02 compute-0 nova_compute[117514]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:        </nova:port>
Oct  8 19:09:02 compute-0 nova_compute[117514]:      </nova:ports>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    </nova:instance>
Oct  8 19:09:02 compute-0 nova_compute[117514]:  </metadata>
Oct  8 19:09:02 compute-0 nova_compute[117514]:  <sysinfo type="smbios">
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <system>
Oct  8 19:09:02 compute-0 nova_compute[117514]:      <entry name="manufacturer">RDO</entry>
Oct  8 19:09:02 compute-0 nova_compute[117514]:      <entry name="product">OpenStack Compute</entry>
Oct  8 19:09:02 compute-0 nova_compute[117514]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 19:09:02 compute-0 nova_compute[117514]:      <entry name="serial">5f1c7c12-d16a-4158-9af6-e40d7ad01f2e</entry>
Oct  8 19:09:02 compute-0 nova_compute[117514]:      <entry name="uuid">5f1c7c12-d16a-4158-9af6-e40d7ad01f2e</entry>
Oct  8 19:09:02 compute-0 nova_compute[117514]:      <entry name="family">Virtual Machine</entry>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    </system>
Oct  8 19:09:02 compute-0 nova_compute[117514]:  </sysinfo>
Oct  8 19:09:02 compute-0 nova_compute[117514]:  <os>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <boot dev="hd"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <smbios mode="sysinfo"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:  </os>
Oct  8 19:09:02 compute-0 nova_compute[117514]:  <features>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <acpi/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <apic/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <vmcoreinfo/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:  </features>
Oct  8 19:09:02 compute-0 nova_compute[117514]:  <clock offset="utc">
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <timer name="hpet" present="no"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:  </clock>
Oct  8 19:09:02 compute-0 nova_compute[117514]:  <cpu mode="host-model" match="exact">
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:  </cpu>
Oct  8 19:09:02 compute-0 nova_compute[117514]:  <devices>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <disk type="file" device="disk">
Oct  8 19:09:02 compute-0 nova_compute[117514]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:      <source file="/var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/disk"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:      <target dev="vda" bus="virtio"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <disk type="file" device="cdrom">
Oct  8 19:09:02 compute-0 nova_compute[117514]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:      <source file="/var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/disk.config"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:      <target dev="sda" bus="sata"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <interface type="ethernet">
Oct  8 19:09:02 compute-0 nova_compute[117514]:      <mac address="fa:16:3e:1c:aa:70"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:      <model type="virtio"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:      <mtu size="1442"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:      <target dev="tapa70af23b-d9"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    </interface>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <serial type="pty">
Oct  8 19:09:02 compute-0 nova_compute[117514]:      <log file="/var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/console.log" append="off"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    </serial>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <video>
Oct  8 19:09:02 compute-0 nova_compute[117514]:      <model type="virtio"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    </video>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <input type="tablet" bus="usb"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <rng model="virtio">
Oct  8 19:09:02 compute-0 nova_compute[117514]:      <backend model="random">/dev/urandom</backend>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    </rng>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <controller type="usb" index="0"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    <memballoon model="virtio">
Oct  8 19:09:02 compute-0 nova_compute[117514]:      <stats period="10"/>
Oct  8 19:09:02 compute-0 nova_compute[117514]:    </memballoon>
Oct  8 19:09:02 compute-0 nova_compute[117514]:  </devices>
Oct  8 19:09:02 compute-0 nova_compute[117514]: </domain>
Oct  8 19:09:02 compute-0 nova_compute[117514]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.629 2 DEBUG nova.compute.manager [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Preparing to wait for external event network-vif-plugged-a70af23b-d9f3-4d3e-96da-692ae05ba88a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.630 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.631 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.631 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.632 2 DEBUG nova.virt.libvirt.vif [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:08:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2036009954',display_name='tempest-TestNetworkBasicOps-server-2036009954',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2036009954',id=5,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIaQYsS3p/kqkwhPUSySAXSbxOdURRwycGYz8zG+mSEb7vM+V8TI/DnRVmOc+q/Hcp4ljBTmVN8Dn0Fwxkk8IhqlYVJKZ25JiPY8aDaNHw2HT5FEQjUWsRu8yiFEP7RRtA==',key_name='tempest-TestNetworkBasicOps-1573009253',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-4zycm9c0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:08:58Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=5f1c7c12-d16a-4158-9af6-e40d7ad01f2e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "address": "fa:16:3e:1c:aa:70", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa70af23b-d9", "ovs_interfaceid": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.633 2 DEBUG nova.network.os_vif_util [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "address": "fa:16:3e:1c:aa:70", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa70af23b-d9", "ovs_interfaceid": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.634 2 DEBUG nova.network.os_vif_util [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1c:aa:70,bridge_name='br-int',has_traffic_filtering=True,id=a70af23b-d9f3-4d3e-96da-692ae05ba88a,network=Network(820a3a2e-47e5-4f6d-88d6-281476a31fb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa70af23b-d9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.636 2 DEBUG os_vif [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:aa:70,bridge_name='br-int',has_traffic_filtering=True,id=a70af23b-d9f3-4d3e-96da-692ae05ba88a,network=Network(820a3a2e-47e5-4f6d-88d6-281476a31fb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa70af23b-d9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.637 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.638 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.643 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa70af23b-d9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.643 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa70af23b-d9, col_values=(('external_ids', {'iface-id': 'a70af23b-d9f3-4d3e-96da-692ae05ba88a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1c:aa:70', 'vm-uuid': '5f1c7c12-d16a-4158-9af6-e40d7ad01f2e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:02 compute-0 NetworkManager[1035]: <info>  [1759950542.6487] manager: (tapa70af23b-d9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.660 2 INFO os_vif [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:aa:70,bridge_name='br-int',has_traffic_filtering=True,id=a70af23b-d9f3-4d3e-96da-692ae05ba88a,network=Network(820a3a2e-47e5-4f6d-88d6-281476a31fb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa70af23b-d9')#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.720 2 DEBUG nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.720 2 DEBUG nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.721 2 DEBUG nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No VIF found with MAC fa:16:3e:1c:aa:70, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 19:09:02 compute-0 nova_compute[117514]: 2025-10-08 19:09:02.721 2 INFO nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Using config drive#033[00m
Oct  8 19:09:03 compute-0 nova_compute[117514]: 2025-10-08 19:09:03.260 2 INFO nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Creating config drive at /var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/disk.config#033[00m
Oct  8 19:09:03 compute-0 nova_compute[117514]: 2025-10-08 19:09:03.269 2 DEBUG oslo_concurrency.processutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpt2k8v3fd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:09:03 compute-0 nova_compute[117514]: 2025-10-08 19:09:03.406 2 DEBUG oslo_concurrency.processutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpt2k8v3fd" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:09:03 compute-0 kernel: tapa70af23b-d9: entered promiscuous mode
Oct  8 19:09:03 compute-0 NetworkManager[1035]: <info>  [1759950543.4873] manager: (tapa70af23b-d9): new Tun device (/org/freedesktop/NetworkManager/Devices/50)
Oct  8 19:09:03 compute-0 ovn_controller[19759]: 2025-10-08T19:09:03Z|00072|binding|INFO|Claiming lport a70af23b-d9f3-4d3e-96da-692ae05ba88a for this chassis.
Oct  8 19:09:03 compute-0 ovn_controller[19759]: 2025-10-08T19:09:03Z|00073|binding|INFO|a70af23b-d9f3-4d3e-96da-692ae05ba88a: Claiming fa:16:3e:1c:aa:70 10.100.0.9
Oct  8 19:09:03 compute-0 nova_compute[117514]: 2025-10-08 19:09:03.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:03.500 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1c:aa:70 10.100.0.9'], port_security=['fa:16:3e:1c:aa:70 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '5f1c7c12-d16a-4158-9af6-e40d7ad01f2e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-820a3a2e-47e5-4f6d-88d6-281476a31fb1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '2', 'neutron:security_group_ids': '02048498-a771-4306-8e83-ef79600f50a0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b8f7e04c-5c12-4776-b9f7-f4835ede26c3, chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=a70af23b-d9f3-4d3e-96da-692ae05ba88a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:09:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:03.502 28643 INFO neutron.agent.ovn.metadata.agent [-] Port a70af23b-d9f3-4d3e-96da-692ae05ba88a in datapath 820a3a2e-47e5-4f6d-88d6-281476a31fb1 bound to our chassis#033[00m
Oct  8 19:09:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:03.504 28643 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 820a3a2e-47e5-4f6d-88d6-281476a31fb1#033[00m
Oct  8 19:09:03 compute-0 ovn_controller[19759]: 2025-10-08T19:09:03Z|00074|binding|INFO|Setting lport a70af23b-d9f3-4d3e-96da-692ae05ba88a ovn-installed in OVS
Oct  8 19:09:03 compute-0 ovn_controller[19759]: 2025-10-08T19:09:03Z|00075|binding|INFO|Setting lport a70af23b-d9f3-4d3e-96da-692ae05ba88a up in Southbound
Oct  8 19:09:03 compute-0 nova_compute[117514]: 2025-10-08 19:09:03.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:03 compute-0 nova_compute[117514]: 2025-10-08 19:09:03.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:03.525 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[f091f6fb-9872-4ea4-88e6-ae832917372b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:09:03 compute-0 systemd-udevd[146848]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 19:09:03 compute-0 NetworkManager[1035]: <info>  [1759950543.5553] device (tapa70af23b-d9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 19:09:03 compute-0 systemd-machined[77568]: New machine qemu-5-instance-00000005.
Oct  8 19:09:03 compute-0 NetworkManager[1035]: <info>  [1759950543.5566] device (tapa70af23b-d9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 19:09:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:03.568 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[ddbb015d-92d0-427a-ade4-ca0de6f9ca3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:09:03 compute-0 systemd[1]: Started Virtual Machine qemu-5-instance-00000005.
Oct  8 19:09:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:03.574 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[d4475a19-d274-4c01-a139-91b796280b29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:09:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:03.608 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[d441cfad-f7ff-404e-8b0f-227fa0afca04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:09:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:03.631 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[96e6100d-678e-4bf2-97a5-f6f79865a0e4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap820a3a2e-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:c1:bb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 118031, 'reachable_time': 41703, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 146862, 'error': None, 'target': 'ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:09:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:03.652 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[106b205f-8f39-44e2-8ead-2e3239cf51d1]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap820a3a2e-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 118046, 'tstamp': 118046}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 146864, 'error': None, 'target': 'ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap820a3a2e-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 118049, 'tstamp': 118049}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 146864, 'error': None, 'target': 'ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:09:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:03.654 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap820a3a2e-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:09:03 compute-0 nova_compute[117514]: 2025-10-08 19:09:03.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:03.658 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap820a3a2e-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:09:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:03.658 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:09:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:03.659 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap820a3a2e-40, col_values=(('external_ids', {'iface-id': '9e4e54fa-32ec-4ece-b34d-e4e72c958a54'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:09:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:03.659 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:09:03 compute-0 nova_compute[117514]: 2025-10-08 19:09:03.705 2 DEBUG nova.compute.manager [req-bb5bf8da-77e7-4702-8d9c-8067d5dbb988 req-5178622a-22b5-4a5a-9970-9ec1fcbd811b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Received event network-vif-plugged-a70af23b-d9f3-4d3e-96da-692ae05ba88a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:09:03 compute-0 nova_compute[117514]: 2025-10-08 19:09:03.706 2 DEBUG oslo_concurrency.lockutils [req-bb5bf8da-77e7-4702-8d9c-8067d5dbb988 req-5178622a-22b5-4a5a-9970-9ec1fcbd811b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:09:03 compute-0 nova_compute[117514]: 2025-10-08 19:09:03.706 2 DEBUG oslo_concurrency.lockutils [req-bb5bf8da-77e7-4702-8d9c-8067d5dbb988 req-5178622a-22b5-4a5a-9970-9ec1fcbd811b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:09:03 compute-0 nova_compute[117514]: 2025-10-08 19:09:03.706 2 DEBUG oslo_concurrency.lockutils [req-bb5bf8da-77e7-4702-8d9c-8067d5dbb988 req-5178622a-22b5-4a5a-9970-9ec1fcbd811b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:09:03 compute-0 nova_compute[117514]: 2025-10-08 19:09:03.706 2 DEBUG nova.compute.manager [req-bb5bf8da-77e7-4702-8d9c-8067d5dbb988 req-5178622a-22b5-4a5a-9970-9ec1fcbd811b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Processing event network-vif-plugged-a70af23b-d9f3-4d3e-96da-692ae05ba88a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 19:09:03 compute-0 nova_compute[117514]: 2025-10-08 19:09:03.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:09:03 compute-0 nova_compute[117514]: 2025-10-08 19:09:03.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:09:04 compute-0 nova_compute[117514]: 2025-10-08 19:09:04.124 2 DEBUG nova.network.neutron [req-01ada483-92b7-4585-8c70-768b45246a77 req-3c44a16d-ae04-4a35-9db1-7126826df1b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Updated VIF entry in instance network info cache for port a70af23b-d9f3-4d3e-96da-692ae05ba88a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 19:09:04 compute-0 nova_compute[117514]: 2025-10-08 19:09:04.125 2 DEBUG nova.network.neutron [req-01ada483-92b7-4585-8c70-768b45246a77 req-3c44a16d-ae04-4a35-9db1-7126826df1b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Updating instance_info_cache with network_info: [{"id": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "address": "fa:16:3e:1c:aa:70", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa70af23b-d9", "ovs_interfaceid": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:09:04 compute-0 nova_compute[117514]: 2025-10-08 19:09:04.138 2 DEBUG oslo_concurrency.lockutils [req-01ada483-92b7-4585-8c70-768b45246a77 req-3c44a16d-ae04-4a35-9db1-7126826df1b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-5f1c7c12-d16a-4158-9af6-e40d7ad01f2e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:09:05 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:05.109 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f81f7a-64d8-418a-a74c-b879bd6deb83, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.288 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950545.288375, 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.289 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] VM Started (Lifecycle Event)#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.292 2 DEBUG nova.compute.manager [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.297 2 DEBUG nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.301 2 INFO nova.virt.libvirt.driver [-] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Instance spawned successfully.#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.302 2 DEBUG nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.309 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.313 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.325 2 DEBUG nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.325 2 DEBUG nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.326 2 DEBUG nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.327 2 DEBUG nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.328 2 DEBUG nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.329 2 DEBUG nova.virt.libvirt.driver [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.338 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.339 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950545.2894785, 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.339 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] VM Paused (Lifecycle Event)#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.368 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.374 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950545.2960498, 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.375 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] VM Resumed (Lifecycle Event)#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.400 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.406 2 INFO nova.compute.manager [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Took 6.69 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.407 2 DEBUG nova.compute.manager [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.412 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.447 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.488 2 INFO nova.compute.manager [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Took 7.28 seconds to build instance.#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.506 2 DEBUG oslo_concurrency.lockutils [None req-92de31af-7688-44be-b780-203f15ec1955 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.376s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.749 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.750 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.750 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.751 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.790 2 DEBUG nova.compute.manager [req-55246877-71c7-46b5-9836-319b4690b26d req-f280551e-d232-455f-9ccb-588eb5bdd096 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Received event network-vif-plugged-a70af23b-d9f3-4d3e-96da-692ae05ba88a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.791 2 DEBUG oslo_concurrency.lockutils [req-55246877-71c7-46b5-9836-319b4690b26d req-f280551e-d232-455f-9ccb-588eb5bdd096 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.791 2 DEBUG oslo_concurrency.lockutils [req-55246877-71c7-46b5-9836-319b4690b26d req-f280551e-d232-455f-9ccb-588eb5bdd096 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.792 2 DEBUG oslo_concurrency.lockutils [req-55246877-71c7-46b5-9836-319b4690b26d req-f280551e-d232-455f-9ccb-588eb5bdd096 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.792 2 DEBUG nova.compute.manager [req-55246877-71c7-46b5-9836-319b4690b26d req-f280551e-d232-455f-9ccb-588eb5bdd096 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] No waiting events found dispatching network-vif-plugged-a70af23b-d9f3-4d3e-96da-692ae05ba88a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.793 2 WARNING nova.compute.manager [req-55246877-71c7-46b5-9836-319b4690b26d req-f280551e-d232-455f-9ccb-588eb5bdd096 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Received unexpected event network-vif-plugged-a70af23b-d9f3-4d3e-96da-692ae05ba88a for instance with vm_state active and task_state None.#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.845 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.936 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:09:05 compute-0 nova_compute[117514]: 2025-10-08 19:09:05.937 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:09:06 compute-0 nova_compute[117514]: 2025-10-08 19:09:06.006 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:09:06 compute-0 nova_compute[117514]: 2025-10-08 19:09:06.011 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:09:06 compute-0 nova_compute[117514]: 2025-10-08 19:09:06.078 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:09:06 compute-0 nova_compute[117514]: 2025-10-08 19:09:06.079 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:09:06 compute-0 nova_compute[117514]: 2025-10-08 19:09:06.166 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:09:06 compute-0 nova_compute[117514]: 2025-10-08 19:09:06.355 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 19:09:06 compute-0 nova_compute[117514]: 2025-10-08 19:09:06.357 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5908MB free_disk=73.38648223876953GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 19:09:06 compute-0 nova_compute[117514]: 2025-10-08 19:09:06.358 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:09:06 compute-0 nova_compute[117514]: 2025-10-08 19:09:06.358 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:09:06 compute-0 nova_compute[117514]: 2025-10-08 19:09:06.437 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Instance b81092db-79a9-4570-9579-4e100364515a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 19:09:06 compute-0 nova_compute[117514]: 2025-10-08 19:09:06.438 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Instance 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 19:09:06 compute-0 nova_compute[117514]: 2025-10-08 19:09:06.439 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 19:09:06 compute-0 nova_compute[117514]: 2025-10-08 19:09:06.439 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 19:09:06 compute-0 nova_compute[117514]: 2025-10-08 19:09:06.505 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:09:06 compute-0 nova_compute[117514]: 2025-10-08 19:09:06.522 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:09:06 compute-0 nova_compute[117514]: 2025-10-08 19:09:06.545 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 19:09:06 compute-0 nova_compute[117514]: 2025-10-08 19:09:06.546 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.188s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:09:07 compute-0 nova_compute[117514]: 2025-10-08 19:09:07.543 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:09:07 compute-0 nova_compute[117514]: 2025-10-08 19:09:07.544 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:09:07 compute-0 nova_compute[117514]: 2025-10-08 19:09:07.545 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 19:09:07 compute-0 nova_compute[117514]: 2025-10-08 19:09:07.545 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 19:09:07 compute-0 nova_compute[117514]: 2025-10-08 19:09:07.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:07 compute-0 nova_compute[117514]: 2025-10-08 19:09:07.706 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "refresh_cache-b81092db-79a9-4570-9579-4e100364515a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:09:07 compute-0 nova_compute[117514]: 2025-10-08 19:09:07.706 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquired lock "refresh_cache-b81092db-79a9-4570-9579-4e100364515a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:09:07 compute-0 nova_compute[117514]: 2025-10-08 19:09:07.706 2 DEBUG nova.network.neutron [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] [instance: b81092db-79a9-4570-9579-4e100364515a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 19:09:07 compute-0 nova_compute[117514]: 2025-10-08 19:09:07.707 2 DEBUG nova.objects.instance [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b81092db-79a9-4570-9579-4e100364515a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:09:07 compute-0 nova_compute[117514]: 2025-10-08 19:09:07.857 2 DEBUG nova.compute.manager [req-30f4ef18-5ee5-4a7c-9b63-d48047bfe8c5 req-01b95e2d-b09b-46f7-864e-36c4c34a8847 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Received event network-changed-a70af23b-d9f3-4d3e-96da-692ae05ba88a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:09:07 compute-0 nova_compute[117514]: 2025-10-08 19:09:07.858 2 DEBUG nova.compute.manager [req-30f4ef18-5ee5-4a7c-9b63-d48047bfe8c5 req-01b95e2d-b09b-46f7-864e-36c4c34a8847 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Refreshing instance network info cache due to event network-changed-a70af23b-d9f3-4d3e-96da-692ae05ba88a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 19:09:07 compute-0 nova_compute[117514]: 2025-10-08 19:09:07.859 2 DEBUG oslo_concurrency.lockutils [req-30f4ef18-5ee5-4a7c-9b63-d48047bfe8c5 req-01b95e2d-b09b-46f7-864e-36c4c34a8847 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-5f1c7c12-d16a-4158-9af6-e40d7ad01f2e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:09:07 compute-0 nova_compute[117514]: 2025-10-08 19:09:07.860 2 DEBUG oslo_concurrency.lockutils [req-30f4ef18-5ee5-4a7c-9b63-d48047bfe8c5 req-01b95e2d-b09b-46f7-864e-36c4c34a8847 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-5f1c7c12-d16a-4158-9af6-e40d7ad01f2e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:09:07 compute-0 nova_compute[117514]: 2025-10-08 19:09:07.860 2 DEBUG nova.network.neutron [req-30f4ef18-5ee5-4a7c-9b63-d48047bfe8c5 req-01b95e2d-b09b-46f7-864e-36c4c34a8847 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Refreshing network info cache for port a70af23b-d9f3-4d3e-96da-692ae05ba88a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 19:09:09 compute-0 podman[146886]: 2025-10-08 19:09:09.696706394 +0000 UTC m=+0.105360650 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 19:09:10 compute-0 nova_compute[117514]: 2025-10-08 19:09:10.760 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:11 compute-0 nova_compute[117514]: 2025-10-08 19:09:11.518 2 DEBUG nova.network.neutron [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] [instance: b81092db-79a9-4570-9579-4e100364515a] Updating instance_info_cache with network_info: [{"id": "4df96566-2548-47bc-bd48-095ff9ce5a25", "address": "fa:16:3e:f7:31:02", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4df96566-25", "ovs_interfaceid": "4df96566-2548-47bc-bd48-095ff9ce5a25", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:09:11 compute-0 nova_compute[117514]: 2025-10-08 19:09:11.533 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Releasing lock "refresh_cache-b81092db-79a9-4570-9579-4e100364515a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:09:11 compute-0 nova_compute[117514]: 2025-10-08 19:09:11.534 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] [instance: b81092db-79a9-4570-9579-4e100364515a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 19:09:11 compute-0 nova_compute[117514]: 2025-10-08 19:09:11.534 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:09:11 compute-0 nova_compute[117514]: 2025-10-08 19:09:11.535 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:09:11 compute-0 nova_compute[117514]: 2025-10-08 19:09:11.535 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:09:11 compute-0 nova_compute[117514]: 2025-10-08 19:09:11.535 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:09:11 compute-0 nova_compute[117514]: 2025-10-08 19:09:11.535 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 19:09:11 compute-0 nova_compute[117514]: 2025-10-08 19:09:11.620 2 DEBUG nova.network.neutron [req-30f4ef18-5ee5-4a7c-9b63-d48047bfe8c5 req-01b95e2d-b09b-46f7-864e-36c4c34a8847 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Updated VIF entry in instance network info cache for port a70af23b-d9f3-4d3e-96da-692ae05ba88a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 19:09:11 compute-0 nova_compute[117514]: 2025-10-08 19:09:11.620 2 DEBUG nova.network.neutron [req-30f4ef18-5ee5-4a7c-9b63-d48047bfe8c5 req-01b95e2d-b09b-46f7-864e-36c4c34a8847 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Updating instance_info_cache with network_info: [{"id": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "address": "fa:16:3e:1c:aa:70", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa70af23b-d9", "ovs_interfaceid": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:09:11 compute-0 nova_compute[117514]: 2025-10-08 19:09:11.639 2 DEBUG oslo_concurrency.lockutils [req-30f4ef18-5ee5-4a7c-9b63-d48047bfe8c5 req-01b95e2d-b09b-46f7-864e-36c4c34a8847 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-5f1c7c12-d16a-4158-9af6-e40d7ad01f2e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:09:12 compute-0 nova_compute[117514]: 2025-10-08 19:09:12.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:12 compute-0 nova_compute[117514]: 2025-10-08 19:09:12.703 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:09:15 compute-0 nova_compute[117514]: 2025-10-08 19:09:15.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:17 compute-0 ovn_controller[19759]: 2025-10-08T19:09:17Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1c:aa:70 10.100.0.9
Oct  8 19:09:17 compute-0 ovn_controller[19759]: 2025-10-08T19:09:17Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1c:aa:70 10.100.0.9
Oct  8 19:09:17 compute-0 nova_compute[117514]: 2025-10-08 19:09:17.655 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:19 compute-0 podman[146923]: 2025-10-08 19:09:19.650331707 +0000 UTC m=+0.070257445 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=edpm, 
container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 19:09:20 compute-0 nova_compute[117514]: 2025-10-08 19:09:20.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:22 compute-0 nova_compute[117514]: 2025-10-08 19:09:22.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:23 compute-0 nova_compute[117514]: 2025-10-08 19:09:23.833 2 INFO nova.compute.manager [None req-42783f33-f50f-4406-9f02-e4b4c7a44b55 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Get console output#033[00m
Oct  8 19:09:23 compute-0 nova_compute[117514]: 2025-10-08 19:09:23.839 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.187 2 DEBUG oslo_concurrency.lockutils [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.187 2 DEBUG oslo_concurrency.lockutils [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.188 2 DEBUG oslo_concurrency.lockutils [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.188 2 DEBUG oslo_concurrency.lockutils [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.188 2 DEBUG oslo_concurrency.lockutils [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.190 2 INFO nova.compute.manager [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Terminating instance#033[00m
Oct  8 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.191 2 DEBUG nova.compute.manager [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 19:09:24 compute-0 kernel: tapa70af23b-d9 (unregistering): left promiscuous mode
Oct  8 19:09:24 compute-0 NetworkManager[1035]: <info>  [1759950564.2263] device (tapa70af23b-d9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 19:09:24 compute-0 ovn_controller[19759]: 2025-10-08T19:09:24Z|00076|binding|INFO|Releasing lport a70af23b-d9f3-4d3e-96da-692ae05ba88a from this chassis (sb_readonly=0)
Oct  8 19:09:24 compute-0 ovn_controller[19759]: 2025-10-08T19:09:24Z|00077|binding|INFO|Setting lport a70af23b-d9f3-4d3e-96da-692ae05ba88a down in Southbound
Oct  8 19:09:24 compute-0 ovn_controller[19759]: 2025-10-08T19:09:24Z|00078|binding|INFO|Removing iface tapa70af23b-d9 ovn-installed in OVS
Oct  8 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:24 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:24.249 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1c:aa:70 10.100.0.9'], port_security=['fa:16:3e:1c:aa:70 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '5f1c7c12-d16a-4158-9af6-e40d7ad01f2e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-820a3a2e-47e5-4f6d-88d6-281476a31fb1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '4', 'neutron:security_group_ids': '02048498-a771-4306-8e83-ef79600f50a0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.212'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b8f7e04c-5c12-4776-b9f7-f4835ede26c3, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=a70af23b-d9f3-4d3e-96da-692ae05ba88a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:09:24 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:24.250 28643 INFO neutron.agent.ovn.metadata.agent [-] Port a70af23b-d9f3-4d3e-96da-692ae05ba88a in datapath 820a3a2e-47e5-4f6d-88d6-281476a31fb1 unbound from our chassis#033[00m
Oct  8 19:09:24 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:24.251 28643 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 820a3a2e-47e5-4f6d-88d6-281476a31fb1#033[00m
Oct  8 19:09:24 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:24.267 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[38340c45-21a4-4974-8311-9077915692b1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:24 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:24.288 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[38d8a6dd-f475-4afb-9f0c-c70bd48cf91c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:09:24 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:24.290 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[33b83c75-569c-4cb6-9eb1-1435c234431d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:09:24 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000005.scope: Deactivated successfully.
Oct  8 19:09:24 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000005.scope: Consumed 13.385s CPU time.
Oct  8 19:09:24 compute-0 systemd-machined[77568]: Machine qemu-5-instance-00000005 terminated.
Oct  8 19:09:24 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:24.310 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[6b3921a2-7291-4820-b23d-6728c569e13d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:09:24 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:24.341 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[fc9fc569-5ee7-454e-8a09-cf55f38e5184]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap820a3a2e-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fc:c1:bb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 1000, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 1000, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 118031, 'reachable_time': 41703, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 146954, 'error': None, 'target': 'ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:09:24 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:24.361 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[b5e3f4a5-5ffc-4bc5-a2a4-d9455d6f5537]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap820a3a2e-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 118046, 'tstamp': 118046}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 146955, 'error': None, 'target': 'ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap820a3a2e-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 118049, 'tstamp': 118049}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 146955, 'error': None, 'target': 'ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:09:24 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:24.363 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap820a3a2e-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:24 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:24.369 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap820a3a2e-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:09:24 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:24.369 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:09:24 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:24.369 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap820a3a2e-40, col_values=(('external_ids', {'iface-id': '9e4e54fa-32ec-4ece-b34d-e4e72c958a54'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:09:24 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:24.369 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.470 2 INFO nova.virt.libvirt.driver [-] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Instance destroyed successfully.#033[00m
Oct  8 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.470 2 DEBUG nova.objects.instance [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'resources' on Instance uuid 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.490 2 DEBUG nova.virt.libvirt.vif [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:08:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2036009954',display_name='tempest-TestNetworkBasicOps-server-2036009954',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2036009954',id=5,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIaQYsS3p/kqkwhPUSySAXSbxOdURRwycGYz8zG+mSEb7vM+V8TI/DnRVmOc+q/Hcp4ljBTmVN8Dn0Fwxkk8IhqlYVJKZ25JiPY8aDaNHw2HT5FEQjUWsRu8yiFEP7RRtA==',key_name='tempest-TestNetworkBasicOps-1573009253',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:09:05Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-4zycm9c0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:09:05Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=5f1c7c12-d16a-4158-9af6-e40d7ad01f2e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "address": "fa:16:3e:1c:aa:70", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa70af23b-d9", "ovs_interfaceid": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.490 2 DEBUG nova.network.os_vif_util [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "address": "fa:16:3e:1c:aa:70", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.212", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa70af23b-d9", "ovs_interfaceid": "a70af23b-d9f3-4d3e-96da-692ae05ba88a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.491 2 DEBUG nova.network.os_vif_util [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1c:aa:70,bridge_name='br-int',has_traffic_filtering=True,id=a70af23b-d9f3-4d3e-96da-692ae05ba88a,network=Network(820a3a2e-47e5-4f6d-88d6-281476a31fb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa70af23b-d9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.492 2 DEBUG os_vif [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1c:aa:70,bridge_name='br-int',has_traffic_filtering=True,id=a70af23b-d9f3-4d3e-96da-692ae05ba88a,network=Network(820a3a2e-47e5-4f6d-88d6-281476a31fb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa70af23b-d9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.494 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa70af23b-d9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.500 2 INFO os_vif [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1c:aa:70,bridge_name='br-int',has_traffic_filtering=True,id=a70af23b-d9f3-4d3e-96da-692ae05ba88a,network=Network(820a3a2e-47e5-4f6d-88d6-281476a31fb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa70af23b-d9')#033[00m
Oct  8 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.501 2 INFO nova.virt.libvirt.driver [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Deleting instance files /var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e_del#033[00m
Oct  8 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.502 2 INFO nova.virt.libvirt.driver [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Deletion of /var/lib/nova/instances/5f1c7c12-d16a-4158-9af6-e40d7ad01f2e_del complete#033[00m
Oct  8 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.552 2 INFO nova.compute.manager [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.553 2 DEBUG oslo.service.loopingcall [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.553 2 DEBUG nova.compute.manager [-] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 19:09:24 compute-0 nova_compute[117514]: 2025-10-08 19:09:24.553 2 DEBUG nova.network.neutron [-] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 19:09:25 compute-0 nova_compute[117514]: 2025-10-08 19:09:25.316 2 DEBUG nova.compute.manager [req-b9d21aea-26e2-474d-bc90-eb2091a5bd31 req-f5f6fb9c-5c0c-4b13-aed2-7a6effe0b3a6 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Received event network-vif-unplugged-a70af23b-d9f3-4d3e-96da-692ae05ba88a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:09:25 compute-0 nova_compute[117514]: 2025-10-08 19:09:25.316 2 DEBUG oslo_concurrency.lockutils [req-b9d21aea-26e2-474d-bc90-eb2091a5bd31 req-f5f6fb9c-5c0c-4b13-aed2-7a6effe0b3a6 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:09:25 compute-0 nova_compute[117514]: 2025-10-08 19:09:25.317 2 DEBUG oslo_concurrency.lockutils [req-b9d21aea-26e2-474d-bc90-eb2091a5bd31 req-f5f6fb9c-5c0c-4b13-aed2-7a6effe0b3a6 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:09:25 compute-0 nova_compute[117514]: 2025-10-08 19:09:25.317 2 DEBUG oslo_concurrency.lockutils [req-b9d21aea-26e2-474d-bc90-eb2091a5bd31 req-f5f6fb9c-5c0c-4b13-aed2-7a6effe0b3a6 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:09:25 compute-0 nova_compute[117514]: 2025-10-08 19:09:25.318 2 DEBUG nova.compute.manager [req-b9d21aea-26e2-474d-bc90-eb2091a5bd31 req-f5f6fb9c-5c0c-4b13-aed2-7a6effe0b3a6 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] No waiting events found dispatching network-vif-unplugged-a70af23b-d9f3-4d3e-96da-692ae05ba88a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:09:25 compute-0 nova_compute[117514]: 2025-10-08 19:09:25.319 2 DEBUG nova.compute.manager [req-b9d21aea-26e2-474d-bc90-eb2091a5bd31 req-f5f6fb9c-5c0c-4b13-aed2-7a6effe0b3a6 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Received event network-vif-unplugged-a70af23b-d9f3-4d3e-96da-692ae05ba88a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 19:09:25 compute-0 podman[146979]: 2025-10-08 19:09:25.65175334 +0000 UTC m=+0.066717241 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  8 19:09:25 compute-0 podman[146978]: 2025-10-08 19:09:25.689347979 +0000 UTC m=+0.100684574 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, managed_by=edpm_ansible, io.openshift.expose-services=, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.openshift.tags=minimal rhel9, release=1755695350, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter)
Oct  8 19:09:25 compute-0 nova_compute[117514]: 2025-10-08 19:09:25.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:27 compute-0 nova_compute[117514]: 2025-10-08 19:09:27.266 2 DEBUG nova.network.neutron [-] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:09:27 compute-0 nova_compute[117514]: 2025-10-08 19:09:27.283 2 INFO nova.compute.manager [-] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Took 2.73 seconds to deallocate network for instance.#033[00m
Oct  8 19:09:27 compute-0 nova_compute[117514]: 2025-10-08 19:09:27.342 2 DEBUG oslo_concurrency.lockutils [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:09:27 compute-0 nova_compute[117514]: 2025-10-08 19:09:27.343 2 DEBUG oslo_concurrency.lockutils [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:09:27 compute-0 nova_compute[117514]: 2025-10-08 19:09:27.352 2 DEBUG nova.compute.manager [req-7ac36a26-bfb8-46c7-8e37-412c308c1da5 req-79acea6d-79a9-4bfd-a053-7c8b09f59ffd bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Received event network-vif-deleted-a70af23b-d9f3-4d3e-96da-692ae05ba88a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:09:27 compute-0 nova_compute[117514]: 2025-10-08 19:09:27.413 2 DEBUG nova.compute.manager [req-733ea0c2-6dbc-4a08-95c0-ac5b7c146550 req-f4690235-3128-4f44-b446-d55f3509f0f2 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Received event network-vif-plugged-a70af23b-d9f3-4d3e-96da-692ae05ba88a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:09:27 compute-0 nova_compute[117514]: 2025-10-08 19:09:27.414 2 DEBUG oslo_concurrency.lockutils [req-733ea0c2-6dbc-4a08-95c0-ac5b7c146550 req-f4690235-3128-4f44-b446-d55f3509f0f2 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:09:27 compute-0 nova_compute[117514]: 2025-10-08 19:09:27.414 2 DEBUG oslo_concurrency.lockutils [req-733ea0c2-6dbc-4a08-95c0-ac5b7c146550 req-f4690235-3128-4f44-b446-d55f3509f0f2 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:09:27 compute-0 nova_compute[117514]: 2025-10-08 19:09:27.414 2 DEBUG oslo_concurrency.lockutils [req-733ea0c2-6dbc-4a08-95c0-ac5b7c146550 req-f4690235-3128-4f44-b446-d55f3509f0f2 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:09:27 compute-0 nova_compute[117514]: 2025-10-08 19:09:27.414 2 DEBUG nova.compute.manager [req-733ea0c2-6dbc-4a08-95c0-ac5b7c146550 req-f4690235-3128-4f44-b446-d55f3509f0f2 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] No waiting events found dispatching network-vif-plugged-a70af23b-d9f3-4d3e-96da-692ae05ba88a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:09:27 compute-0 nova_compute[117514]: 2025-10-08 19:09:27.414 2 WARNING nova.compute.manager [req-733ea0c2-6dbc-4a08-95c0-ac5b7c146550 req-f4690235-3128-4f44-b446-d55f3509f0f2 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Received unexpected event network-vif-plugged-a70af23b-d9f3-4d3e-96da-692ae05ba88a for instance with vm_state deleted and task_state None.#033[00m
Oct  8 19:09:27 compute-0 nova_compute[117514]: 2025-10-08 19:09:27.420 2 DEBUG nova.compute.provider_tree [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:09:27 compute-0 nova_compute[117514]: 2025-10-08 19:09:27.435 2 DEBUG nova.scheduler.client.report [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:09:27 compute-0 nova_compute[117514]: 2025-10-08 19:09:27.457 2 DEBUG oslo_concurrency.lockutils [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:09:27 compute-0 nova_compute[117514]: 2025-10-08 19:09:27.481 2 INFO nova.scheduler.client.report [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Deleted allocations for instance 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e#033[00m
Oct  8 19:09:27 compute-0 nova_compute[117514]: 2025-10-08 19:09:27.559 2 DEBUG oslo_concurrency.lockutils [None req-29c45a55-6be9-4b5f-8df4-3c763fe14ce1 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "5f1c7c12-d16a-4158-9af6-e40d7ad01f2e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.372s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:09:29 compute-0 nova_compute[117514]: 2025-10-08 19:09:29.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:29 compute-0 podman[147018]: 2025-10-08 19:09:29.663832139 +0000 UTC m=+0.081529223 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 19:09:29 compute-0 nova_compute[117514]: 2025-10-08 19:09:29.699 2 DEBUG oslo_concurrency.lockutils [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "b81092db-79a9-4570-9579-4e100364515a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:09:29 compute-0 nova_compute[117514]: 2025-10-08 19:09:29.699 2 DEBUG oslo_concurrency.lockutils [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "b81092db-79a9-4570-9579-4e100364515a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:09:29 compute-0 nova_compute[117514]: 2025-10-08 19:09:29.699 2 DEBUG oslo_concurrency.lockutils [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "b81092db-79a9-4570-9579-4e100364515a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:09:29 compute-0 nova_compute[117514]: 2025-10-08 19:09:29.699 2 DEBUG oslo_concurrency.lockutils [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "b81092db-79a9-4570-9579-4e100364515a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:09:29 compute-0 nova_compute[117514]: 2025-10-08 19:09:29.700 2 DEBUG oslo_concurrency.lockutils [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "b81092db-79a9-4570-9579-4e100364515a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:09:29 compute-0 nova_compute[117514]: 2025-10-08 19:09:29.701 2 INFO nova.compute.manager [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Terminating instance#033[00m
Oct  8 19:09:29 compute-0 nova_compute[117514]: 2025-10-08 19:09:29.702 2 DEBUG nova.compute.manager [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 19:09:29 compute-0 kernel: tap4df96566-25 (unregistering): left promiscuous mode
Oct  8 19:09:29 compute-0 NetworkManager[1035]: <info>  [1759950569.7353] device (tap4df96566-25): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 19:09:29 compute-0 nova_compute[117514]: 2025-10-08 19:09:29.745 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:29 compute-0 ovn_controller[19759]: 2025-10-08T19:09:29Z|00079|binding|INFO|Releasing lport 4df96566-2548-47bc-bd48-095ff9ce5a25 from this chassis (sb_readonly=0)
Oct  8 19:09:29 compute-0 ovn_controller[19759]: 2025-10-08T19:09:29Z|00080|binding|INFO|Setting lport 4df96566-2548-47bc-bd48-095ff9ce5a25 down in Southbound
Oct  8 19:09:29 compute-0 ovn_controller[19759]: 2025-10-08T19:09:29Z|00081|binding|INFO|Removing iface tap4df96566-25 ovn-installed in OVS
Oct  8 19:09:29 compute-0 nova_compute[117514]: 2025-10-08 19:09:29.749 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:29.756 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:31:02 10.100.0.4'], port_security=['fa:16:3e:f7:31:02 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'b81092db-79a9-4570-9579-4e100364515a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-820a3a2e-47e5-4f6d-88d6-281476a31fb1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b3c14dd0-3cf2-41c1-9115-bc2ef0b741ba', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b8f7e04c-5c12-4776-b9f7-f4835ede26c3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=4df96566-2548-47bc-bd48-095ff9ce5a25) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:09:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:29.758 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 4df96566-2548-47bc-bd48-095ff9ce5a25 in datapath 820a3a2e-47e5-4f6d-88d6-281476a31fb1 unbound from our chassis#033[00m
Oct  8 19:09:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:29.759 28643 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 820a3a2e-47e5-4f6d-88d6-281476a31fb1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 19:09:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:29.761 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[2153d278-f662-4958-8a3e-19f8fded96a5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:09:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:29.762 28643 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1 namespace which is not needed anymore#033[00m
Oct  8 19:09:29 compute-0 nova_compute[117514]: 2025-10-08 19:09:29.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:29 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000004.scope: Deactivated successfully.
Oct  8 19:09:29 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000004.scope: Consumed 16.423s CPU time.
Oct  8 19:09:29 compute-0 systemd-machined[77568]: Machine qemu-4-instance-00000004 terminated.
Oct  8 19:09:29 compute-0 neutron-haproxy-ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1[146594]: [NOTICE]   (146614) : haproxy version is 2.8.14-c23fe91
Oct  8 19:09:29 compute-0 neutron-haproxy-ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1[146594]: [NOTICE]   (146614) : path to executable is /usr/sbin/haproxy
Oct  8 19:09:29 compute-0 neutron-haproxy-ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1[146594]: [WARNING]  (146614) : Exiting Master process...
Oct  8 19:09:29 compute-0 neutron-haproxy-ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1[146594]: [ALERT]    (146614) : Current worker (146616) exited with code 143 (Terminated)
Oct  8 19:09:29 compute-0 neutron-haproxy-ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1[146594]: [WARNING]  (146614) : All workers exited. Exiting... (0)
Oct  8 19:09:29 compute-0 systemd[1]: libpod-fb07793201da9ab1609a4f4565bfd293a68536af438cb3b77cd79566b9425f07.scope: Deactivated successfully.
Oct  8 19:09:29 compute-0 podman[147066]: 2025-10-08 19:09:29.958498842 +0000 UTC m=+0.068776461 container died fb07793201da9ab1609a4f4565bfd293a68536af438cb3b77cd79566b9425f07 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  8 19:09:29 compute-0 nova_compute[117514]: 2025-10-08 19:09:29.983 2 INFO nova.virt.libvirt.driver [-] [instance: b81092db-79a9-4570-9579-4e100364515a] Instance destroyed successfully.#033[00m
Oct  8 19:09:29 compute-0 nova_compute[117514]: 2025-10-08 19:09:29.983 2 DEBUG nova.objects.instance [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'resources' on Instance uuid b81092db-79a9-4570-9579-4e100364515a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:09:29 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fb07793201da9ab1609a4f4565bfd293a68536af438cb3b77cd79566b9425f07-userdata-shm.mount: Deactivated successfully.
Oct  8 19:09:29 compute-0 systemd[1]: var-lib-containers-storage-overlay-58ea3c57cf88dfce89603da495e87146b2bb91b27139ea83f58571fbcf3d370c-merged.mount: Deactivated successfully.
Oct  8 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.001 2 DEBUG nova.virt.libvirt.vif [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:08:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1923358122',display_name='tempest-TestNetworkBasicOps-server-1923358122',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1923358122',id=4,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLQLVJLI0B1DuDHRr0xZejVz519BcFo77SQm/iU8QOSD6bvHcTPIzjucvYocQDiXeDjzdepuMi6T99yqrAkyTWA86BuQoBq3ywvQZ7i+b1z4o3zuHDlJxNAK8zAsugXiSA==',key_name='tempest-TestNetworkBasicOps-993932891',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:08:30Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-bunw0mg3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:08:30Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=b81092db-79a9-4570-9579-4e100364515a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4df96566-2548-47bc-bd48-095ff9ce5a25", "address": "fa:16:3e:f7:31:02", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4df96566-25", "ovs_interfaceid": "4df96566-2548-47bc-bd48-095ff9ce5a25", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.002 2 DEBUG nova.network.os_vif_util [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "4df96566-2548-47bc-bd48-095ff9ce5a25", "address": "fa:16:3e:f7:31:02", "network": {"id": "820a3a2e-47e5-4f6d-88d6-281476a31fb1", "bridge": "br-int", "label": "tempest-network-smoke--67383231", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4df96566-25", "ovs_interfaceid": "4df96566-2548-47bc-bd48-095ff9ce5a25", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.003 2 DEBUG nova.network.os_vif_util [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f7:31:02,bridge_name='br-int',has_traffic_filtering=True,id=4df96566-2548-47bc-bd48-095ff9ce5a25,network=Network(820a3a2e-47e5-4f6d-88d6-281476a31fb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4df96566-25') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.003 2 DEBUG os_vif [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f7:31:02,bridge_name='br-int',has_traffic_filtering=True,id=4df96566-2548-47bc-bd48-095ff9ce5a25,network=Network(820a3a2e-47e5-4f6d-88d6-281476a31fb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4df96566-25') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.005 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4df96566-25, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.007 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 19:09:30 compute-0 podman[147066]: 2025-10-08 19:09:30.010810501 +0000 UTC m=+0.121088090 container cleanup fb07793201da9ab1609a4f4565bfd293a68536af438cb3b77cd79566b9425f07 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  8 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.016 2 INFO os_vif [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f7:31:02,bridge_name='br-int',has_traffic_filtering=True,id=4df96566-2548-47bc-bd48-095ff9ce5a25,network=Network(820a3a2e-47e5-4f6d-88d6-281476a31fb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4df96566-25')#033[00m
Oct  8 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.017 2 INFO nova.virt.libvirt.driver [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Deleting instance files /var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a_del#033[00m
Oct  8 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.018 2 INFO nova.virt.libvirt.driver [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Deletion of /var/lib/nova/instances/b81092db-79a9-4570-9579-4e100364515a_del complete#033[00m
Oct  8 19:09:30 compute-0 systemd[1]: libpod-conmon-fb07793201da9ab1609a4f4565bfd293a68536af438cb3b77cd79566b9425f07.scope: Deactivated successfully.
Oct  8 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.078 2 INFO nova.compute.manager [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.079 2 DEBUG oslo.service.loopingcall [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.079 2 DEBUG nova.compute.manager [-] [instance: b81092db-79a9-4570-9579-4e100364515a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.079 2 DEBUG nova.network.neutron [-] [instance: b81092db-79a9-4570-9579-4e100364515a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 19:09:30 compute-0 podman[147119]: 2025-10-08 19:09:30.115505611 +0000 UTC m=+0.067805973 container remove fb07793201da9ab1609a4f4565bfd293a68536af438cb3b77cd79566b9425f07 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  8 19:09:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:30.125 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[b193fa4d-337e-4382-bc30-af9cf9275211]: (4, ('Wed Oct  8 07:09:29 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1 (fb07793201da9ab1609a4f4565bfd293a68536af438cb3b77cd79566b9425f07)\nfb07793201da9ab1609a4f4565bfd293a68536af438cb3b77cd79566b9425f07\nWed Oct  8 07:09:30 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1 (fb07793201da9ab1609a4f4565bfd293a68536af438cb3b77cd79566b9425f07)\nfb07793201da9ab1609a4f4565bfd293a68536af438cb3b77cd79566b9425f07\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:09:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:30.128 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[ef8c121c-1584-49f2-83d2-759b49650937]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:09:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:30.129 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap820a3a2e-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:30 compute-0 kernel: tap820a3a2e-40: left promiscuous mode
Oct  8 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:30.162 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[b9622e3a-9f98-4735-9712-02f5b16a4c31]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:09:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:30.196 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[eb4ba426-eea9-40b7-9903-c9a9c61ef927]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:09:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:30.198 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[d5334e2b-88b3-49c1-b531-502570add548]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:09:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:30.224 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[a191e736-c1ca-463e-9a70-edb7af8cdef7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 118022, 'reachable_time': 41150, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 147132, 'error': None, 'target': 'ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:09:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:30.226 28783 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-820a3a2e-47e5-4f6d-88d6-281476a31fb1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 19:09:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:30.227 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[2cce2644-7475-4659-b20d-3d320ba47048]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:09:30 compute-0 systemd[1]: run-netns-ovnmeta\x2d820a3a2e\x2d47e5\x2d4f6d\x2d88d6\x2d281476a31fb1.mount: Deactivated successfully.
Oct  8 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.268 2 DEBUG nova.compute.manager [req-c97f6596-a163-415a-ad01-1e59ab466d22 req-5f35c814-d439-4682-8b3c-ad3a16926199 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Received event network-vif-unplugged-4df96566-2548-47bc-bd48-095ff9ce5a25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.269 2 DEBUG oslo_concurrency.lockutils [req-c97f6596-a163-415a-ad01-1e59ab466d22 req-5f35c814-d439-4682-8b3c-ad3a16926199 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "b81092db-79a9-4570-9579-4e100364515a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.273 2 DEBUG oslo_concurrency.lockutils [req-c97f6596-a163-415a-ad01-1e59ab466d22 req-5f35c814-d439-4682-8b3c-ad3a16926199 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b81092db-79a9-4570-9579-4e100364515a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.274 2 DEBUG oslo_concurrency.lockutils [req-c97f6596-a163-415a-ad01-1e59ab466d22 req-5f35c814-d439-4682-8b3c-ad3a16926199 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b81092db-79a9-4570-9579-4e100364515a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.274 2 DEBUG nova.compute.manager [req-c97f6596-a163-415a-ad01-1e59ab466d22 req-5f35c814-d439-4682-8b3c-ad3a16926199 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] No waiting events found dispatching network-vif-unplugged-4df96566-2548-47bc-bd48-095ff9ce5a25 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.274 2 DEBUG nova.compute.manager [req-c97f6596-a163-415a-ad01-1e59ab466d22 req-5f35c814-d439-4682-8b3c-ad3a16926199 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Received event network-vif-unplugged-4df96566-2548-47bc-bd48-095ff9ce5a25 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.787 2 DEBUG nova.network.neutron [-] [instance: b81092db-79a9-4570-9579-4e100364515a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.809 2 INFO nova.compute.manager [-] [instance: b81092db-79a9-4570-9579-4e100364515a] Took 0.73 seconds to deallocate network for instance.#033[00m
Oct  8 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.840 2 DEBUG nova.compute.manager [req-fe788ad6-489d-4f84-9c34-70273116173c req-63be88b9-3fd5-4055-aec9-3b7bb6ebf3d9 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Received event network-vif-deleted-4df96566-2548-47bc-bd48-095ff9ce5a25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.862 2 DEBUG oslo_concurrency.lockutils [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.863 2 DEBUG oslo_concurrency.lockutils [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.906 2 DEBUG nova.compute.provider_tree [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.930 2 DEBUG nova.scheduler.client.report [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.959 2 DEBUG oslo_concurrency.lockutils [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:09:30 compute-0 nova_compute[117514]: 2025-10-08 19:09:30.996 2 INFO nova.scheduler.client.report [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Deleted allocations for instance b81092db-79a9-4570-9579-4e100364515a#033[00m
Oct  8 19:09:31 compute-0 nova_compute[117514]: 2025-10-08 19:09:31.082 2 DEBUG oslo_concurrency.lockutils [None req-87400f29-3e78-4b7c-ac51-6279a9b9a271 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "b81092db-79a9-4570-9579-4e100364515a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.383s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:09:31 compute-0 podman[147135]: 2025-10-08 19:09:31.673008681 +0000 UTC m=+0.079252618 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  8 19:09:31 compute-0 podman[147137]: 2025-10-08 19:09:31.68221381 +0000 UTC m=+0.079161455 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 19:09:31 compute-0 podman[147136]: 2025-10-08 19:09:31.756192012 +0000 UTC m=+0.153976581 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001)
Oct  8 19:09:32 compute-0 nova_compute[117514]: 2025-10-08 19:09:32.349 2 DEBUG nova.compute.manager [req-a744c4f7-4e52-43c7-9e58-dc8644ed9901 req-442b3477-e3a0-4072-9d2c-2ed9f0d101df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Received event network-vif-plugged-4df96566-2548-47bc-bd48-095ff9ce5a25 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:09:32 compute-0 nova_compute[117514]: 2025-10-08 19:09:32.350 2 DEBUG oslo_concurrency.lockutils [req-a744c4f7-4e52-43c7-9e58-dc8644ed9901 req-442b3477-e3a0-4072-9d2c-2ed9f0d101df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "b81092db-79a9-4570-9579-4e100364515a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:09:32 compute-0 nova_compute[117514]: 2025-10-08 19:09:32.350 2 DEBUG oslo_concurrency.lockutils [req-a744c4f7-4e52-43c7-9e58-dc8644ed9901 req-442b3477-e3a0-4072-9d2c-2ed9f0d101df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b81092db-79a9-4570-9579-4e100364515a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:09:32 compute-0 nova_compute[117514]: 2025-10-08 19:09:32.350 2 DEBUG oslo_concurrency.lockutils [req-a744c4f7-4e52-43c7-9e58-dc8644ed9901 req-442b3477-e3a0-4072-9d2c-2ed9f0d101df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "b81092db-79a9-4570-9579-4e100364515a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:09:32 compute-0 nova_compute[117514]: 2025-10-08 19:09:32.350 2 DEBUG nova.compute.manager [req-a744c4f7-4e52-43c7-9e58-dc8644ed9901 req-442b3477-e3a0-4072-9d2c-2ed9f0d101df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] No waiting events found dispatching network-vif-plugged-4df96566-2548-47bc-bd48-095ff9ce5a25 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:09:32 compute-0 nova_compute[117514]: 2025-10-08 19:09:32.350 2 WARNING nova.compute.manager [req-a744c4f7-4e52-43c7-9e58-dc8644ed9901 req-442b3477-e3a0-4072-9d2c-2ed9f0d101df bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: b81092db-79a9-4570-9579-4e100364515a] Received unexpected event network-vif-plugged-4df96566-2548-47bc-bd48-095ff9ce5a25 for instance with vm_state deleted and task_state None.#033[00m
Oct  8 19:09:35 compute-0 nova_compute[117514]: 2025-10-08 19:09:35.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:35 compute-0 nova_compute[117514]: 2025-10-08 19:09:35.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:36 compute-0 nova_compute[117514]: 2025-10-08 19:09:36.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:36 compute-0 nova_compute[117514]: 2025-10-08 19:09:36.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:39 compute-0 nova_compute[117514]: 2025-10-08 19:09:39.468 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759950564.466738, 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:09:39 compute-0 nova_compute[117514]: 2025-10-08 19:09:39.469 2 INFO nova.compute.manager [-] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] VM Stopped (Lifecycle Event)#033[00m
Oct  8 19:09:39 compute-0 nova_compute[117514]: 2025-10-08 19:09:39.487 2 DEBUG nova.compute.manager [None req-8fd81e35-7b7d-4fc4-9e97-119efa4e6095 - - - - - -] [instance: 5f1c7c12-d16a-4158-9af6-e40d7ad01f2e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:09:40 compute-0 nova_compute[117514]: 2025-10-08 19:09:40.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:40 compute-0 podman[147202]: 2025-10-08 19:09:40.645779945 +0000 UTC m=+0.061044233 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 19:09:40 compute-0 nova_compute[117514]: 2025-10-08 19:09:40.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:44.230 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:09:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:44.231 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:09:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:44.231 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:09:44 compute-0 nova_compute[117514]: 2025-10-08 19:09:44.982 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759950569.9802787, b81092db-79a9-4570-9579-4e100364515a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:09:44 compute-0 nova_compute[117514]: 2025-10-08 19:09:44.983 2 INFO nova.compute.manager [-] [instance: b81092db-79a9-4570-9579-4e100364515a] VM Stopped (Lifecycle Event)#033[00m
Oct  8 19:09:45 compute-0 nova_compute[117514]: 2025-10-08 19:09:45.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:45 compute-0 nova_compute[117514]: 2025-10-08 19:09:45.038 2 DEBUG nova.compute.manager [None req-8e2b379e-2fb5-4935-9bae-f249390812b2 - - - - - -] [instance: b81092db-79a9-4570-9579-4e100364515a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:09:45 compute-0 nova_compute[117514]: 2025-10-08 19:09:45.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:49 compute-0 nova_compute[117514]: 2025-10-08 19:09:49.913 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:09:49 compute-0 nova_compute[117514]: 2025-10-08 19:09:49.914 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:09:49 compute-0 nova_compute[117514]: 2025-10-08 19:09:49.929 2 DEBUG nova.compute.manager [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.008 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.009 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.020 2 DEBUG nova.virt.hardware [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.021 2 INFO nova.compute.claims [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Claim successful on node compute-0.ctlplane.example.com#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.122 2 DEBUG nova.compute.provider_tree [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.138 2 DEBUG nova.scheduler.client.report [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.160 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.161 2 DEBUG nova.compute.manager [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.212 2 DEBUG nova.compute.manager [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.213 2 DEBUG nova.network.neutron [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.229 2 INFO nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.243 2 DEBUG nova.compute.manager [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.329 2 DEBUG nova.compute.manager [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.331 2 DEBUG nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.332 2 INFO nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Creating image(s)#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.333 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "/var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.333 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.335 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.360 2 DEBUG oslo_concurrency.processutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.447 2 DEBUG oslo_concurrency.processutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.448 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "008eb3078b811ee47058b7252a820910c35fc6df" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.449 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.466 2 DEBUG oslo_concurrency.processutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.542 2 DEBUG oslo_concurrency.processutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.543 2 DEBUG oslo_concurrency.processutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.640 2 DEBUG oslo_concurrency.processutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk 1073741824" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.642 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.643 2 DEBUG oslo_concurrency.processutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:09:50 compute-0 podman[147240]: 2025-10-08 19:09:50.677693593 +0000 UTC m=+0.104230197 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.732 2 DEBUG oslo_concurrency.processutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.733 2 DEBUG nova.virt.disk.api [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Checking if we can resize image /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.733 2 DEBUG oslo_concurrency.processutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.786 2 DEBUG oslo_concurrency.processutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.788 2 DEBUG nova.virt.disk.api [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Cannot resize image /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.788 2 DEBUG nova.objects.instance [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'migration_context' on Instance uuid 783f8889-2bc8-4641-bdb9-95ee4226a2fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.806 2 DEBUG nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.806 2 DEBUG nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Ensure instance console log exists: /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.807 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.808 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.808 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:09:50 compute-0 nova_compute[117514]: 2025-10-08 19:09:50.920 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:51 compute-0 nova_compute[117514]: 2025-10-08 19:09:51.117 2 DEBUG nova.policy [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 19:09:53 compute-0 nova_compute[117514]: 2025-10-08 19:09:53.187 2 DEBUG nova.network.neutron [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Successfully created port: bfb32e9e-52b6-4043-b9a6-129d11fa2814 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 19:09:55 compute-0 nova_compute[117514]: 2025-10-08 19:09:55.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:55 compute-0 nova_compute[117514]: 2025-10-08 19:09:55.096 2 DEBUG nova.network.neutron [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Successfully updated port: bfb32e9e-52b6-4043-b9a6-129d11fa2814 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 19:09:55 compute-0 nova_compute[117514]: 2025-10-08 19:09:55.126 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:09:55 compute-0 nova_compute[117514]: 2025-10-08 19:09:55.126 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquired lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:09:55 compute-0 nova_compute[117514]: 2025-10-08 19:09:55.126 2 DEBUG nova.network.neutron [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 19:09:55 compute-0 nova_compute[117514]: 2025-10-08 19:09:55.211 2 DEBUG nova.compute.manager [req-dea58d1a-1d5c-4e74-962e-f87c0ba7c9fc req-449fc039-0678-4d47-b3f3-84001da0719d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received event network-changed-bfb32e9e-52b6-4043-b9a6-129d11fa2814 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:09:55 compute-0 nova_compute[117514]: 2025-10-08 19:09:55.212 2 DEBUG nova.compute.manager [req-dea58d1a-1d5c-4e74-962e-f87c0ba7c9fc req-449fc039-0678-4d47-b3f3-84001da0719d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Refreshing instance network info cache due to event network-changed-bfb32e9e-52b6-4043-b9a6-129d11fa2814. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 19:09:55 compute-0 nova_compute[117514]: 2025-10-08 19:09:55.212 2 DEBUG oslo_concurrency.lockutils [req-dea58d1a-1d5c-4e74-962e-f87c0ba7c9fc req-449fc039-0678-4d47-b3f3-84001da0719d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:09:55 compute-0 nova_compute[117514]: 2025-10-08 19:09:55.274 2 DEBUG nova.network.neutron [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 19:09:55 compute-0 nova_compute[117514]: 2025-10-08 19:09:55.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.387 2 DEBUG nova.network.neutron [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Updating instance_info_cache with network_info: [{"id": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "address": "fa:16:3e:4e:85:2e", "network": {"id": "0d073e98-c9f2-4b90-8237-84ff2fa99090", "bridge": "br-int", "label": "tempest-network-smoke--1785011615", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb32e9e-52", "ovs_interfaceid": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.408 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Releasing lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.409 2 DEBUG nova.compute.manager [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Instance network_info: |[{"id": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "address": "fa:16:3e:4e:85:2e", "network": {"id": "0d073e98-c9f2-4b90-8237-84ff2fa99090", "bridge": "br-int", "label": "tempest-network-smoke--1785011615", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb32e9e-52", "ovs_interfaceid": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.410 2 DEBUG oslo_concurrency.lockutils [req-dea58d1a-1d5c-4e74-962e-f87c0ba7c9fc req-449fc039-0678-4d47-b3f3-84001da0719d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.410 2 DEBUG nova.network.neutron [req-dea58d1a-1d5c-4e74-962e-f87c0ba7c9fc req-449fc039-0678-4d47-b3f3-84001da0719d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Refreshing network info cache for port bfb32e9e-52b6-4043-b9a6-129d11fa2814 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.416 2 DEBUG nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Start _get_guest_xml network_info=[{"id": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "address": "fa:16:3e:4e:85:2e", "network": {"id": "0d073e98-c9f2-4b90-8237-84ff2fa99090", "bridge": "br-int", "label": "tempest-network-smoke--1785011615", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb32e9e-52", "ovs_interfaceid": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'image_id': '23cfa426-7011-4566-992d-1c7af39f70dd'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.425 2 WARNING nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.437 2 DEBUG nova.virt.libvirt.host [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.438 2 DEBUG nova.virt.libvirt.host [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.442 2 DEBUG nova.virt.libvirt.host [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.443 2 DEBUG nova.virt.libvirt.host [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.443 2 DEBUG nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.444 2 DEBUG nova.virt.hardware [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T19:05:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e8a148fc-4419-4813-98ff-a17e2a95609e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.445 2 DEBUG nova.virt.hardware [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.445 2 DEBUG nova.virt.hardware [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.446 2 DEBUG nova.virt.hardware [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.446 2 DEBUG nova.virt.hardware [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.446 2 DEBUG nova.virt.hardware [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.447 2 DEBUG nova.virt.hardware [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.447 2 DEBUG nova.virt.hardware [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.448 2 DEBUG nova.virt.hardware [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.448 2 DEBUG nova.virt.hardware [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.449 2 DEBUG nova.virt.hardware [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.455 2 DEBUG nova.virt.libvirt.vif [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:09:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1641480242',display_name='tempest-TestNetworkBasicOps-server-1641480242',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1641480242',id=6,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPSyEE+QeB2DOtd7xoaY+J9mVl+DzPE43UDhso7eEGO9aQXs3wmj/YcqHfJ97lRUVFOa3dbwNiIUyunSI3DyzjQf/v6cjCZ2KkxRD0GJnQ0zRM5omnXaZRnz3Bq5VONa9g==',key_name='tempest-TestNetworkBasicOps-1535027603',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-pbt1zket',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:09:50Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=783f8889-2bc8-4641-bdb9-95ee4226a2fd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "address": "fa:16:3e:4e:85:2e", "network": {"id": "0d073e98-c9f2-4b90-8237-84ff2fa99090", "bridge": "br-int", "label": "tempest-network-smoke--1785011615", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb32e9e-52", "ovs_interfaceid": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.456 2 DEBUG nova.network.os_vif_util [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "address": "fa:16:3e:4e:85:2e", "network": {"id": "0d073e98-c9f2-4b90-8237-84ff2fa99090", "bridge": "br-int", "label": "tempest-network-smoke--1785011615", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb32e9e-52", "ovs_interfaceid": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.457 2 DEBUG nova.network.os_vif_util [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:85:2e,bridge_name='br-int',has_traffic_filtering=True,id=bfb32e9e-52b6-4043-b9a6-129d11fa2814,network=Network(0d073e98-c9f2-4b90-8237-84ff2fa99090),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfb32e9e-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.459 2 DEBUG nova.objects.instance [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 783f8889-2bc8-4641-bdb9-95ee4226a2fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.478 2 DEBUG nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] End _get_guest_xml xml=<domain type="kvm">
Oct  8 19:09:56 compute-0 nova_compute[117514]:  <uuid>783f8889-2bc8-4641-bdb9-95ee4226a2fd</uuid>
Oct  8 19:09:56 compute-0 nova_compute[117514]:  <name>instance-00000006</name>
Oct  8 19:09:56 compute-0 nova_compute[117514]:  <memory>131072</memory>
Oct  8 19:09:56 compute-0 nova_compute[117514]:  <vcpu>1</vcpu>
Oct  8 19:09:56 compute-0 nova_compute[117514]:  <metadata>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 19:09:56 compute-0 nova_compute[117514]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:      <nova:name>tempest-TestNetworkBasicOps-server-1641480242</nova:name>
Oct  8 19:09:56 compute-0 nova_compute[117514]:      <nova:creationTime>2025-10-08 19:09:56</nova:creationTime>
Oct  8 19:09:56 compute-0 nova_compute[117514]:      <nova:flavor name="m1.nano">
Oct  8 19:09:56 compute-0 nova_compute[117514]:        <nova:memory>128</nova:memory>
Oct  8 19:09:56 compute-0 nova_compute[117514]:        <nova:disk>1</nova:disk>
Oct  8 19:09:56 compute-0 nova_compute[117514]:        <nova:swap>0</nova:swap>
Oct  8 19:09:56 compute-0 nova_compute[117514]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 19:09:56 compute-0 nova_compute[117514]:        <nova:vcpus>1</nova:vcpus>
Oct  8 19:09:56 compute-0 nova_compute[117514]:      </nova:flavor>
Oct  8 19:09:56 compute-0 nova_compute[117514]:      <nova:owner>
Oct  8 19:09:56 compute-0 nova_compute[117514]:        <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct  8 19:09:56 compute-0 nova_compute[117514]:        <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct  8 19:09:56 compute-0 nova_compute[117514]:      </nova:owner>
Oct  8 19:09:56 compute-0 nova_compute[117514]:      <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:      <nova:ports>
Oct  8 19:09:56 compute-0 nova_compute[117514]:        <nova:port uuid="bfb32e9e-52b6-4043-b9a6-129d11fa2814">
Oct  8 19:09:56 compute-0 nova_compute[117514]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:        </nova:port>
Oct  8 19:09:56 compute-0 nova_compute[117514]:      </nova:ports>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    </nova:instance>
Oct  8 19:09:56 compute-0 nova_compute[117514]:  </metadata>
Oct  8 19:09:56 compute-0 nova_compute[117514]:  <sysinfo type="smbios">
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <system>
Oct  8 19:09:56 compute-0 nova_compute[117514]:      <entry name="manufacturer">RDO</entry>
Oct  8 19:09:56 compute-0 nova_compute[117514]:      <entry name="product">OpenStack Compute</entry>
Oct  8 19:09:56 compute-0 nova_compute[117514]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 19:09:56 compute-0 nova_compute[117514]:      <entry name="serial">783f8889-2bc8-4641-bdb9-95ee4226a2fd</entry>
Oct  8 19:09:56 compute-0 nova_compute[117514]:      <entry name="uuid">783f8889-2bc8-4641-bdb9-95ee4226a2fd</entry>
Oct  8 19:09:56 compute-0 nova_compute[117514]:      <entry name="family">Virtual Machine</entry>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    </system>
Oct  8 19:09:56 compute-0 nova_compute[117514]:  </sysinfo>
Oct  8 19:09:56 compute-0 nova_compute[117514]:  <os>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <boot dev="hd"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <smbios mode="sysinfo"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:  </os>
Oct  8 19:09:56 compute-0 nova_compute[117514]:  <features>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <acpi/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <apic/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <vmcoreinfo/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:  </features>
Oct  8 19:09:56 compute-0 nova_compute[117514]:  <clock offset="utc">
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <timer name="hpet" present="no"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:  </clock>
Oct  8 19:09:56 compute-0 nova_compute[117514]:  <cpu mode="host-model" match="exact">
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:  </cpu>
Oct  8 19:09:56 compute-0 nova_compute[117514]:  <devices>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <disk type="file" device="disk">
Oct  8 19:09:56 compute-0 nova_compute[117514]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:      <source file="/var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:      <target dev="vda" bus="virtio"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <disk type="file" device="cdrom">
Oct  8 19:09:56 compute-0 nova_compute[117514]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:      <source file="/var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.config"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:      <target dev="sda" bus="sata"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <interface type="ethernet">
Oct  8 19:09:56 compute-0 nova_compute[117514]:      <mac address="fa:16:3e:4e:85:2e"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:      <model type="virtio"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:      <mtu size="1442"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:      <target dev="tapbfb32e9e-52"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    </interface>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <serial type="pty">
Oct  8 19:09:56 compute-0 nova_compute[117514]:      <log file="/var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/console.log" append="off"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    </serial>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <video>
Oct  8 19:09:56 compute-0 nova_compute[117514]:      <model type="virtio"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    </video>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <input type="tablet" bus="usb"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <rng model="virtio">
Oct  8 19:09:56 compute-0 nova_compute[117514]:      <backend model="random">/dev/urandom</backend>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    </rng>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <controller type="usb" index="0"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    <memballoon model="virtio">
Oct  8 19:09:56 compute-0 nova_compute[117514]:      <stats period="10"/>
Oct  8 19:09:56 compute-0 nova_compute[117514]:    </memballoon>
Oct  8 19:09:56 compute-0 nova_compute[117514]:  </devices>
Oct  8 19:09:56 compute-0 nova_compute[117514]: </domain>
Oct  8 19:09:56 compute-0 nova_compute[117514]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.479 2 DEBUG nova.compute.manager [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Preparing to wait for external event network-vif-plugged-bfb32e9e-52b6-4043-b9a6-129d11fa2814 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.480 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.480 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.481 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.481 2 DEBUG nova.virt.libvirt.vif [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:09:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1641480242',display_name='tempest-TestNetworkBasicOps-server-1641480242',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1641480242',id=6,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPSyEE+QeB2DOtd7xoaY+J9mVl+DzPE43UDhso7eEGO9aQXs3wmj/YcqHfJ97lRUVFOa3dbwNiIUyunSI3DyzjQf/v6cjCZ2KkxRD0GJnQ0zRM5omnXaZRnz3Bq5VONa9g==',key_name='tempest-TestNetworkBasicOps-1535027603',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-pbt1zket',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:09:50Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=783f8889-2bc8-4641-bdb9-95ee4226a2fd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "address": "fa:16:3e:4e:85:2e", "network": {"id": "0d073e98-c9f2-4b90-8237-84ff2fa99090", "bridge": "br-int", "label": "tempest-network-smoke--1785011615", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb32e9e-52", "ovs_interfaceid": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.481 2 DEBUG nova.network.os_vif_util [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "address": "fa:16:3e:4e:85:2e", "network": {"id": "0d073e98-c9f2-4b90-8237-84ff2fa99090", "bridge": "br-int", "label": "tempest-network-smoke--1785011615", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb32e9e-52", "ovs_interfaceid": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.482 2 DEBUG nova.network.os_vif_util [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4e:85:2e,bridge_name='br-int',has_traffic_filtering=True,id=bfb32e9e-52b6-4043-b9a6-129d11fa2814,network=Network(0d073e98-c9f2-4b90-8237-84ff2fa99090),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfb32e9e-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.483 2 DEBUG os_vif [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:85:2e,bridge_name='br-int',has_traffic_filtering=True,id=bfb32e9e-52b6-4043-b9a6-129d11fa2814,network=Network(0d073e98-c9f2-4b90-8237-84ff2fa99090),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfb32e9e-52') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.483 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.484 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.489 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbfb32e9e-52, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.489 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbfb32e9e-52, col_values=(('external_ids', {'iface-id': 'bfb32e9e-52b6-4043-b9a6-129d11fa2814', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4e:85:2e', 'vm-uuid': '783f8889-2bc8-4641-bdb9-95ee4226a2fd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:56 compute-0 NetworkManager[1035]: <info>  [1759950596.4930] manager: (tapbfb32e9e-52): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/51)
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.505 2 INFO os_vif [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4e:85:2e,bridge_name='br-int',has_traffic_filtering=True,id=bfb32e9e-52b6-4043-b9a6-129d11fa2814,network=Network(0d073e98-c9f2-4b90-8237-84ff2fa99090),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfb32e9e-52')#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.566 2 DEBUG nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.568 2 DEBUG nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.568 2 DEBUG nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No VIF found with MAC fa:16:3e:4e:85:2e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 19:09:56 compute-0 nova_compute[117514]: 2025-10-08 19:09:56.568 2 INFO nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Using config drive#033[00m
Oct  8 19:09:56 compute-0 podman[147275]: 2025-10-08 19:09:56.611698285 +0000 UTC m=+0.066953108 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 19:09:56 compute-0 podman[147274]: 2025-10-08 19:09:56.616710371 +0000 UTC m=+0.077639480 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_id=edpm, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  8 19:09:57 compute-0 nova_compute[117514]: 2025-10-08 19:09:57.158 2 INFO nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Creating config drive at /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.config#033[00m
Oct  8 19:09:57 compute-0 nova_compute[117514]: 2025-10-08 19:09:57.164 2 DEBUG oslo_concurrency.processutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb5eau1jo execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:09:57 compute-0 nova_compute[117514]: 2025-10-08 19:09:57.303 2 DEBUG oslo_concurrency.processutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb5eau1jo" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:09:57 compute-0 kernel: tapbfb32e9e-52: entered promiscuous mode
Oct  8 19:09:57 compute-0 NetworkManager[1035]: <info>  [1759950597.3985] manager: (tapbfb32e9e-52): new Tun device (/org/freedesktop/NetworkManager/Devices/52)
Oct  8 19:09:57 compute-0 ovn_controller[19759]: 2025-10-08T19:09:57Z|00082|binding|INFO|Claiming lport bfb32e9e-52b6-4043-b9a6-129d11fa2814 for this chassis.
Oct  8 19:09:57 compute-0 ovn_controller[19759]: 2025-10-08T19:09:57Z|00083|binding|INFO|bfb32e9e-52b6-4043-b9a6-129d11fa2814: Claiming fa:16:3e:4e:85:2e 10.100.0.14
Oct  8 19:09:57 compute-0 nova_compute[117514]: 2025-10-08 19:09:57.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:57 compute-0 nova_compute[117514]: 2025-10-08 19:09:57.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.425 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:85:2e 10.100.0.14'], port_security=['fa:16:3e:4e:85:2e 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0d073e98-c9f2-4b90-8237-84ff2fa99090', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e1f96720-345d-4fd7-8b5f-d68f6fe81454', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd3b59ed-5967-491c-a3b5-d0ba2b165b15, chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=bfb32e9e-52b6-4043-b9a6-129d11fa2814) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.427 28643 INFO neutron.agent.ovn.metadata.agent [-] Port bfb32e9e-52b6-4043-b9a6-129d11fa2814 in datapath 0d073e98-c9f2-4b90-8237-84ff2fa99090 bound to our chassis#033[00m
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.428 28643 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0d073e98-c9f2-4b90-8237-84ff2fa99090#033[00m
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.443 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[6cf3e7a6-0947-4d9a-9bfc-ed3f89d6cce8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.444 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0d073e98-c1 in ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.447 144726 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0d073e98-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.447 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[fff2d7f4-3567-4889-8cbb-5ed7d6631111]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.448 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[43498475-7187-4f00-bc30-e6dfa1a7e492]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:09:57 compute-0 systemd-machined[77568]: New machine qemu-6-instance-00000006.
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.463 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[d1b90d76-492a-403b-84a1-427cbd5d293f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:09:57 compute-0 nova_compute[117514]: 2025-10-08 19:09:57.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:57 compute-0 ovn_controller[19759]: 2025-10-08T19:09:57Z|00084|binding|INFO|Setting lport bfb32e9e-52b6-4043-b9a6-129d11fa2814 ovn-installed in OVS
Oct  8 19:09:57 compute-0 ovn_controller[19759]: 2025-10-08T19:09:57Z|00085|binding|INFO|Setting lport bfb32e9e-52b6-4043-b9a6-129d11fa2814 up in Southbound
Oct  8 19:09:57 compute-0 systemd[1]: Started Virtual Machine qemu-6-instance-00000006.
Oct  8 19:09:57 compute-0 nova_compute[117514]: 2025-10-08 19:09:57.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.493 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[369134e0-e2aa-4a5a-bc05-d553d086ef15]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:09:57 compute-0 systemd-udevd[147333]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 19:09:57 compute-0 NetworkManager[1035]: <info>  [1759950597.5129] device (tapbfb32e9e-52): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 19:09:57 compute-0 NetworkManager[1035]: <info>  [1759950597.5138] device (tapbfb32e9e-52): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.536 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[b2125bbd-4b27-4bfb-b1f8-9416ebb6a7a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:09:57 compute-0 systemd-udevd[147338]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.542 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[3ce90c26-a586-4dd6-b703-bdc77f042301]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:09:57 compute-0 NetworkManager[1035]: <info>  [1759950597.5440] manager: (tap0d073e98-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/53)
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.589 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[a8dc1c17-3027-47cd-a63e-86046e3d4455]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.595 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[28764abd-c22a-4f8c-b4f2-f87c1ce6a0e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:09:57 compute-0 NetworkManager[1035]: <info>  [1759950597.6280] device (tap0d073e98-c0): carrier: link connected
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.632 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[b3fbb484-5d1f-4196-b6df-bd35824ca717]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.656 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[c96e4d6c-cb1d-453a-8b51-4f25d2909a73]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0d073e98-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:56:43'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 126820, 'reachable_time': 32249, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 147363, 'error': None, 'target': 'ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.680 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[fb644dc3-1fd1-4770-8ebe-8d135148bd1d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe36:5643'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 126820, 'tstamp': 126820}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 147364, 'error': None, 'target': 'ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:09:57 compute-0 nova_compute[117514]: 2025-10-08 19:09:57.697 2 DEBUG nova.network.neutron [req-dea58d1a-1d5c-4e74-962e-f87c0ba7c9fc req-449fc039-0678-4d47-b3f3-84001da0719d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Updated VIF entry in instance network info cache for port bfb32e9e-52b6-4043-b9a6-129d11fa2814. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 19:09:57 compute-0 nova_compute[117514]: 2025-10-08 19:09:57.698 2 DEBUG nova.network.neutron [req-dea58d1a-1d5c-4e74-962e-f87c0ba7c9fc req-449fc039-0678-4d47-b3f3-84001da0719d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Updating instance_info_cache with network_info: [{"id": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "address": "fa:16:3e:4e:85:2e", "network": {"id": "0d073e98-c9f2-4b90-8237-84ff2fa99090", "bridge": "br-int", "label": "tempest-network-smoke--1785011615", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb32e9e-52", "ovs_interfaceid": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.703 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[c401d904-f04b-44fa-b6ea-56f4e69362f2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0d073e98-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:56:43'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 126820, 'reachable_time': 32249, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 147365, 'error': None, 'target': 'ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:09:57 compute-0 nova_compute[117514]: 2025-10-08 19:09:57.714 2 DEBUG oslo_concurrency.lockutils [req-dea58d1a-1d5c-4e74-962e-f87c0ba7c9fc req-449fc039-0678-4d47-b3f3-84001da0719d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:09:57 compute-0 nova_compute[117514]: 2025-10-08 19:09:57.718 2 DEBUG nova.compute.manager [req-65dcc135-4bee-4397-9880-e66e2f48a2ed req-5e19914d-c671-48e9-bd39-96df0317b38f bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received event network-vif-plugged-bfb32e9e-52b6-4043-b9a6-129d11fa2814 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:09:57 compute-0 nova_compute[117514]: 2025-10-08 19:09:57.719 2 DEBUG oslo_concurrency.lockutils [req-65dcc135-4bee-4397-9880-e66e2f48a2ed req-5e19914d-c671-48e9-bd39-96df0317b38f bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:09:57 compute-0 nova_compute[117514]: 2025-10-08 19:09:57.720 2 DEBUG oslo_concurrency.lockutils [req-65dcc135-4bee-4397-9880-e66e2f48a2ed req-5e19914d-c671-48e9-bd39-96df0317b38f bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:09:57 compute-0 nova_compute[117514]: 2025-10-08 19:09:57.720 2 DEBUG oslo_concurrency.lockutils [req-65dcc135-4bee-4397-9880-e66e2f48a2ed req-5e19914d-c671-48e9-bd39-96df0317b38f bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:09:57 compute-0 nova_compute[117514]: 2025-10-08 19:09:57.721 2 DEBUG nova.compute.manager [req-65dcc135-4bee-4397-9880-e66e2f48a2ed req-5e19914d-c671-48e9-bd39-96df0317b38f bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Processing event network-vif-plugged-bfb32e9e-52b6-4043-b9a6-129d11fa2814 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.747 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[20e0fa64-f842-4e85-8e41-b607b34e325a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.825 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[cb468035-7751-4b32-8102-53e4f19c23d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.827 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0d073e98-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.827 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.828 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0d073e98-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:09:57 compute-0 kernel: tap0d073e98-c0: entered promiscuous mode
Oct  8 19:09:57 compute-0 NetworkManager[1035]: <info>  [1759950597.8312] manager: (tap0d073e98-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Oct  8 19:09:57 compute-0 nova_compute[117514]: 2025-10-08 19:09:57.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.835 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0d073e98-c0, col_values=(('external_ids', {'iface-id': 'ef1b5170-2d11-4e01-98e4-310f59c22ecd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:09:57 compute-0 ovn_controller[19759]: 2025-10-08T19:09:57Z|00086|binding|INFO|Releasing lport ef1b5170-2d11-4e01-98e4-310f59c22ecd from this chassis (sb_readonly=0)
Oct  8 19:09:57 compute-0 nova_compute[117514]: 2025-10-08 19:09:57.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.838 28643 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0d073e98-c9f2-4b90-8237-84ff2fa99090.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0d073e98-c9f2-4b90-8237-84ff2fa99090.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.839 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[3aae28d3-ff3a-49e5-ab1b-2b15925d6ad8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.840 28643 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]: global
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]:    log         /dev/log local0 debug
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]:    log-tag     haproxy-metadata-proxy-0d073e98-c9f2-4b90-8237-84ff2fa99090
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]:    user        root
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]:    group       root
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]:    maxconn     1024
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]:    pidfile     /var/lib/neutron/external/pids/0d073e98-c9f2-4b90-8237-84ff2fa99090.pid.haproxy
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]:    daemon
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]: 
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]: defaults
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]:    log global
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]:    mode http
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]:    option httplog
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]:    option dontlognull
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]:    option http-server-close
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]:    option forwardfor
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]:    retries                 3
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]:    timeout http-request    30s
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]:    timeout connect         30s
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]:    timeout client          32s
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]:    timeout server          32s
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]:    timeout http-keep-alive 30s
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]: 
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]: 
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]: listen listener
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]:    bind 169.254.169.254:80
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]:    http-request add-header X-OVN-Network-ID 0d073e98-c9f2-4b90-8237-84ff2fa99090
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 19:09:57 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:09:57.841 28643 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090', 'env', 'PROCESS_TAG=haproxy-0d073e98-c9f2-4b90-8237-84ff2fa99090', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0d073e98-c9f2-4b90-8237-84ff2fa99090.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 19:09:57 compute-0 nova_compute[117514]: 2025-10-08 19:09:57.847 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:09:58 compute-0 podman[147404]: 2025-10-08 19:09:58.308804596 +0000 UTC m=+0.060509610 container create a10c865d8f33b87218a52fa6d5b32a88534c0d9b9c0a166b926f6b4b8ef386fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Oct  8 19:09:58 compute-0 systemd[1]: Started libpod-conmon-a10c865d8f33b87218a52fa6d5b32a88534c0d9b9c0a166b926f6b4b8ef386fc.scope.
Oct  8 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.344 2 DEBUG nova.compute.manager [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 19:09:58 compute-0 systemd[1]: Started libcrun container.
Oct  8 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.347 2 DEBUG nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.348 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950598.3480842, 783f8889-2bc8-4641-bdb9-95ee4226a2fd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.348 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] VM Started (Lifecycle Event)#033[00m
Oct  8 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.352 2 INFO nova.virt.libvirt.driver [-] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Instance spawned successfully.#033[00m
Oct  8 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.353 2 DEBUG nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 19:09:58 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a1feba6530d5a46f34a3cb37ffae2c111c4760047000322a985a4db99d10005/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 19:09:58 compute-0 podman[147404]: 2025-10-08 19:09:58.273431132 +0000 UTC m=+0.025136196 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct  8 19:09:58 compute-0 podman[147404]: 2025-10-08 19:09:58.370369475 +0000 UTC m=+0.122074499 container init a10c865d8f33b87218a52fa6d5b32a88534c0d9b9c0a166b926f6b4b8ef386fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  8 19:09:58 compute-0 podman[147404]: 2025-10-08 19:09:58.376654469 +0000 UTC m=+0.128359473 container start a10c865d8f33b87218a52fa6d5b32a88534c0d9b9c0a166b926f6b4b8ef386fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  8 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.389 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:09:58 compute-0 neutron-haproxy-ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090[147419]: [NOTICE]   (147423) : New worker (147425) forked
Oct  8 19:09:58 compute-0 neutron-haproxy-ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090[147419]: [NOTICE]   (147423) : Loading success.
Oct  8 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.398 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.401 2 DEBUG nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.402 2 DEBUG nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.405 2 DEBUG nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.405 2 DEBUG nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.406 2 DEBUG nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.406 2 DEBUG nova.virt.libvirt.driver [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.433 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.434 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950598.348189, 783f8889-2bc8-4641-bdb9-95ee4226a2fd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.435 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] VM Paused (Lifecycle Event)#033[00m
Oct  8 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.472 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.475 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950598.3487449, 783f8889-2bc8-4641-bdb9-95ee4226a2fd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.476 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] VM Resumed (Lifecycle Event)#033[00m
Oct  8 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.484 2 INFO nova.compute.manager [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Took 8.15 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.485 2 DEBUG nova.compute.manager [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.496 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.499 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.527 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.537 2 INFO nova.compute.manager [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Took 8.57 seconds to build instance.#033[00m
Oct  8 19:09:58 compute-0 nova_compute[117514]: 2025-10-08 19:09:58.553 2 DEBUG oslo_concurrency.lockutils [None req-e45245f4-388d-425b-9e42-3cb049c08859 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.638s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:09:59 compute-0 nova_compute[117514]: 2025-10-08 19:09:59.794 2 DEBUG nova.compute.manager [req-8d271516-cd18-4a8b-9d12-f76234311223 req-2b10a661-3433-43cd-8f25-b8e17a5096fc bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received event network-vif-plugged-bfb32e9e-52b6-4043-b9a6-129d11fa2814 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:09:59 compute-0 nova_compute[117514]: 2025-10-08 19:09:59.795 2 DEBUG oslo_concurrency.lockutils [req-8d271516-cd18-4a8b-9d12-f76234311223 req-2b10a661-3433-43cd-8f25-b8e17a5096fc bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:09:59 compute-0 nova_compute[117514]: 2025-10-08 19:09:59.796 2 DEBUG oslo_concurrency.lockutils [req-8d271516-cd18-4a8b-9d12-f76234311223 req-2b10a661-3433-43cd-8f25-b8e17a5096fc bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:09:59 compute-0 nova_compute[117514]: 2025-10-08 19:09:59.796 2 DEBUG oslo_concurrency.lockutils [req-8d271516-cd18-4a8b-9d12-f76234311223 req-2b10a661-3433-43cd-8f25-b8e17a5096fc bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:09:59 compute-0 nova_compute[117514]: 2025-10-08 19:09:59.796 2 DEBUG nova.compute.manager [req-8d271516-cd18-4a8b-9d12-f76234311223 req-2b10a661-3433-43cd-8f25-b8e17a5096fc bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] No waiting events found dispatching network-vif-plugged-bfb32e9e-52b6-4043-b9a6-129d11fa2814 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:09:59 compute-0 nova_compute[117514]: 2025-10-08 19:09:59.797 2 WARNING nova.compute.manager [req-8d271516-cd18-4a8b-9d12-f76234311223 req-2b10a661-3433-43cd-8f25-b8e17a5096fc bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received unexpected event network-vif-plugged-bfb32e9e-52b6-4043-b9a6-129d11fa2814 for instance with vm_state active and task_state None.#033[00m
Oct  8 19:10:00 compute-0 ovn_controller[19759]: 2025-10-08T19:10:00Z|00087|binding|INFO|Releasing lport ef1b5170-2d11-4e01-98e4-310f59c22ecd from this chassis (sb_readonly=0)
Oct  8 19:10:00 compute-0 nova_compute[117514]: 2025-10-08 19:10:00.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:00 compute-0 NetworkManager[1035]: <info>  [1759950600.5431] manager: (patch-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Oct  8 19:10:00 compute-0 NetworkManager[1035]: <info>  [1759950600.5448] manager: (patch-br-int-to-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Oct  8 19:10:00 compute-0 nova_compute[117514]: 2025-10-08 19:10:00.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:00 compute-0 ovn_controller[19759]: 2025-10-08T19:10:00Z|00088|binding|INFO|Releasing lport ef1b5170-2d11-4e01-98e4-310f59c22ecd from this chassis (sb_readonly=0)
Oct  8 19:10:00 compute-0 nova_compute[117514]: 2025-10-08 19:10:00.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:00 compute-0 systemd[1]: Starting system activity accounting tool...
Oct  8 19:10:00 compute-0 systemd[1]: sysstat-collect.service: Deactivated successfully.
Oct  8 19:10:00 compute-0 systemd[1]: Finished system activity accounting tool.
Oct  8 19:10:00 compute-0 podman[147434]: 2025-10-08 19:10:00.72967979 +0000 UTC m=+0.103418974 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 19:10:00 compute-0 nova_compute[117514]: 2025-10-08 19:10:00.853 2 DEBUG nova.compute.manager [req-8fa52e10-6370-4413-a2c8-fa754f0e87fd req-b2690757-1189-4b87-a6f6-9a61de74ec3a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received event network-changed-bfb32e9e-52b6-4043-b9a6-129d11fa2814 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:10:00 compute-0 nova_compute[117514]: 2025-10-08 19:10:00.853 2 DEBUG nova.compute.manager [req-8fa52e10-6370-4413-a2c8-fa754f0e87fd req-b2690757-1189-4b87-a6f6-9a61de74ec3a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Refreshing instance network info cache due to event network-changed-bfb32e9e-52b6-4043-b9a6-129d11fa2814. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 19:10:00 compute-0 nova_compute[117514]: 2025-10-08 19:10:00.854 2 DEBUG oslo_concurrency.lockutils [req-8fa52e10-6370-4413-a2c8-fa754f0e87fd req-b2690757-1189-4b87-a6f6-9a61de74ec3a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:10:00 compute-0 nova_compute[117514]: 2025-10-08 19:10:00.854 2 DEBUG oslo_concurrency.lockutils [req-8fa52e10-6370-4413-a2c8-fa754f0e87fd req-b2690757-1189-4b87-a6f6-9a61de74ec3a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:10:00 compute-0 nova_compute[117514]: 2025-10-08 19:10:00.854 2 DEBUG nova.network.neutron [req-8fa52e10-6370-4413-a2c8-fa754f0e87fd req-b2690757-1189-4b87-a6f6-9a61de74ec3a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Refreshing network info cache for port bfb32e9e-52b6-4043-b9a6-129d11fa2814 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 19:10:00 compute-0 nova_compute[117514]: 2025-10-08 19:10:00.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:01 compute-0 nova_compute[117514]: 2025-10-08 19:10:01.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:01 compute-0 nova_compute[117514]: 2025-10-08 19:10:01.823 2 DEBUG nova.network.neutron [req-8fa52e10-6370-4413-a2c8-fa754f0e87fd req-b2690757-1189-4b87-a6f6-9a61de74ec3a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Updated VIF entry in instance network info cache for port bfb32e9e-52b6-4043-b9a6-129d11fa2814. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 19:10:01 compute-0 nova_compute[117514]: 2025-10-08 19:10:01.824 2 DEBUG nova.network.neutron [req-8fa52e10-6370-4413-a2c8-fa754f0e87fd req-b2690757-1189-4b87-a6f6-9a61de74ec3a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Updating instance_info_cache with network_info: [{"id": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "address": "fa:16:3e:4e:85:2e", "network": {"id": "0d073e98-c9f2-4b90-8237-84ff2fa99090", "bridge": "br-int", "label": "tempest-network-smoke--1785011615", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb32e9e-52", "ovs_interfaceid": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:10:01 compute-0 nova_compute[117514]: 2025-10-08 19:10:01.844 2 DEBUG oslo_concurrency.lockutils [req-8fa52e10-6370-4413-a2c8-fa754f0e87fd req-b2690757-1189-4b87-a6f6-9a61de74ec3a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:10:02 compute-0 podman[147465]: 2025-10-08 19:10:02.670523365 +0000 UTC m=+0.075830518 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 19:10:02 compute-0 podman[147463]: 2025-10-08 19:10:02.693827056 +0000 UTC m=+0.103180007 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  8 19:10:02 compute-0 podman[147464]: 2025-10-08 19:10:02.721777913 +0000 UTC m=+0.134537743 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 19:10:03 compute-0 nova_compute[117514]: 2025-10-08 19:10:03.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:10:05 compute-0 nova_compute[117514]: 2025-10-08 19:10:05.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:10:05 compute-0 nova_compute[117514]: 2025-10-08 19:10:05.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:06 compute-0 nova_compute[117514]: 2025-10-08 19:10:06.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:06 compute-0 nova_compute[117514]: 2025-10-08 19:10:06.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:10:06 compute-0 nova_compute[117514]: 2025-10-08 19:10:06.718 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:10:06 compute-0 nova_compute[117514]: 2025-10-08 19:10:06.747 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:10:06 compute-0 nova_compute[117514]: 2025-10-08 19:10:06.751 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:10:06 compute-0 nova_compute[117514]: 2025-10-08 19:10:06.751 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:10:06 compute-0 nova_compute[117514]: 2025-10-08 19:10:06.752 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 19:10:06 compute-0 nova_compute[117514]: 2025-10-08 19:10:06.843 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:10:06 compute-0 nova_compute[117514]: 2025-10-08 19:10:06.942 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:10:06 compute-0 nova_compute[117514]: 2025-10-08 19:10:06.944 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:10:07 compute-0 nova_compute[117514]: 2025-10-08 19:10:07.013 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:10:07 compute-0 nova_compute[117514]: 2025-10-08 19:10:07.211 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 19:10:07 compute-0 nova_compute[117514]: 2025-10-08 19:10:07.213 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5939MB free_disk=73.41488647460938GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 19:10:07 compute-0 nova_compute[117514]: 2025-10-08 19:10:07.214 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:10:07 compute-0 nova_compute[117514]: 2025-10-08 19:10:07.214 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:10:07 compute-0 nova_compute[117514]: 2025-10-08 19:10:07.324 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Instance 783f8889-2bc8-4641-bdb9-95ee4226a2fd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 19:10:07 compute-0 nova_compute[117514]: 2025-10-08 19:10:07.325 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 19:10:07 compute-0 nova_compute[117514]: 2025-10-08 19:10:07.325 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 19:10:07 compute-0 nova_compute[117514]: 2025-10-08 19:10:07.371 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:10:07 compute-0 nova_compute[117514]: 2025-10-08 19:10:07.386 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:10:07 compute-0 nova_compute[117514]: 2025-10-08 19:10:07.410 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 19:10:07 compute-0 nova_compute[117514]: 2025-10-08 19:10:07.410 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.196s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.245 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'name': 'tempest-TestNetworkBasicOps-server-1641480242', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000006', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'hostId': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.246 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.246 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.246 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1641480242>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1641480242>]
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.247 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.247 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.247 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1641480242>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1641480242>]
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.247 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.250 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 783f8889-2bc8-4641-bdb9-95ee4226a2fd / tapbfb32e9e-52 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.251 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '59ac8332-5284-4251-972f-1fcbc96b74e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000006-783f8889-2bc8-4641-bdb9-95ee4226a2fd-tapbfb32e9e-52', 'timestamp': '2025-10-08T19:10:08.247946', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'tapbfb32e9e-52', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:85:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfb32e9e-52'}, 'message_id': '67c690da-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.877164166, 'message_signature': '7e8d8a7d5331579e981382ddf35932b5ffe915addaad7ebe17b8049e1e1a3504'}]}, 'timestamp': '2025-10-08 19:10:08.252064', '_unique_id': 'acfb74cbd15a4b269c09aa037c2fed37'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.253 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.254 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.278 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.279 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2077fb84-44d7-40d3-9132-df4fa0810708', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd-vda', 'timestamp': '2025-10-08T19:10:08.255145', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '67cac5ce-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.884361186, 'message_signature': '1968980c403dd572fd25753bd6a4a54710bfaabb54944dc62fbababfce694128'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd-sda', 'timestamp': '2025-10-08T19:10:08.255145', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '67caeafe-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.884361186, 'message_signature': 'af5f05d7cd411568017f5d9e07bb47a4ccafb57cb3f11950e30501858689c0a0'}]}, 'timestamp': '2025-10-08 19:10:08.280483', '_unique_id': 'ef279cfca85c4e05aa790d0d45332b43'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.281 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.284 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.284 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.284 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1641480242>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1641480242>]
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.284 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.299 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.300 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f76c475f-9210-4a97-b3cc-7c09e8cc309f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd-vda', 'timestamp': '2025-10-08T19:10:08.284925', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '67cdf802-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.914143417, 'message_signature': '8634c394ff11065505affffc95f8f57e2feca32876ed9fbd88db340811b8e052'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd-sda', 'timestamp': '2025-10-08T19:10:08.284925', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '67ce0a40-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.914143417, 'message_signature': '73cea4d89a34139c517c37e24856f6f13c2024484abd57b532b474d3fd220810'}]}, 'timestamp': '2025-10-08 19:10:08.300956', '_unique_id': '6cf0648dcbfe449eac7c09609442e3e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.302 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.303 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.303 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.device.read.bytes volume: 23816192 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.304 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '262a7157-0ac5-4335-b6ca-df479a13f509', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23816192, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd-vda', 'timestamp': '2025-10-08T19:10:08.303521', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '67ce8236-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.884361186, 'message_signature': '9d814652e1a5ecc1926e7f8771a18bdaa8c8d1f55f7584d8d0b929874d16cd5f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd-sda', 'timestamp': '2025-10-08T19:10:08.303521', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '67ce9596-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.884361186, 'message_signature': 'bb9d5f669b9194c9ff30e9df6f5880fc647a2561154ac907d67685a70f600a9f'}]}, 'timestamp': '2025-10-08 19:10:08.304479', '_unique_id': 'da72b37b5a5048d3ae68957b8f4c8412'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.305 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.306 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.307 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.307 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1641480242>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1641480242>]
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.307 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.333 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/memory.usage volume: 40.3671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f4b875f-fc7f-4e1c-8ea4-aeaff479318e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.3671875, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'timestamp': '2025-10-08T19:10:08.307579', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '67d32a52-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.962898662, 'message_signature': '8b6f7c50256cdb929db0c75abf109a1c42c75f1cd8a913bbc53d1fcc9f9b6165'}]}, 'timestamp': '2025-10-08 19:10:08.334576', '_unique_id': 'b57501b5c1b64164ac9edbefab90a5b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.336 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.337 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.337 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'efa12b42-d049-43d7-90cc-f64326835aaa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000006-783f8889-2bc8-4641-bdb9-95ee4226a2fd-tapbfb32e9e-52', 'timestamp': '2025-10-08T19:10:08.337655', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'tapbfb32e9e-52', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:85:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfb32e9e-52'}, 'message_id': '67d3bb70-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.877164166, 'message_signature': 'ec5bd9fdc80d350cda331c190485666f6b02060aac7ec23f51a7bb83e3d1555b'}]}, 'timestamp': '2025-10-08 19:10:08.338259', '_unique_id': 'bed1bacc2cef4037bd2776cc26fda8cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.339 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.340 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.341 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.341 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '15b22c9d-f89d-47e6-95f3-5c062d87b84d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd-vda', 'timestamp': '2025-10-08T19:10:08.341098', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '67d44018-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.914143417, 'message_signature': 'fd25c88e92ce428a01db97bda9a2800260a9dd16c7d883802cea258d844ff4a3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd-sda', 'timestamp': '2025-10-08T19:10:08.341098', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '67d45468-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.914143417, 'message_signature': '766c62586d92252b91ba97818e10eb0520efea3478c669bacb79c63c3b134886'}]}, 'timestamp': '2025-10-08 19:10:08.342170', '_unique_id': '564e3dc3a9ea46f3a8ed172ba2dd2e58'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.343 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.344 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.344 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8c38427d-d1d6-4c54-aa4d-fd90439caf1e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000006-783f8889-2bc8-4641-bdb9-95ee4226a2fd-tapbfb32e9e-52', 'timestamp': '2025-10-08T19:10:08.344701', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'tapbfb32e9e-52', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:85:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfb32e9e-52'}, 'message_id': '67d4cede-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.877164166, 'message_signature': '82aa521a4946745bbaf56e180e7b888555512fd5b78a29e3f1e2697a7ae78a79'}]}, 'timestamp': '2025-10-08 19:10:08.345305', '_unique_id': '10eddd7060ed4f4b891863c4de764885'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.346 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.347 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.347 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.348 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f3ec50a4-bb4f-473c-a412-56ab305e7361', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd-vda', 'timestamp': '2025-10-08T19:10:08.347764', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '67d544e0-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.884361186, 'message_signature': 'be05585fe3909a7913092dbfea69b33f2a69f354c9e11cdf106beccbd99a5943'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd-sda', 'timestamp': '2025-10-08T19:10:08.347764', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '67d55732-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.884361186, 'message_signature': '6d6da5ed6ad2c95cc0637593f364dacd30d873f25ac222632f1e08deba2acad7'}]}, 'timestamp': '2025-10-08 19:10:08.348760', '_unique_id': 'e9ab4fcc955446a89a71365cbea8bc17'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.349 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.351 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.351 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd2eca84-88be-4e8b-b908-5a1dd10fe9bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000006-783f8889-2bc8-4641-bdb9-95ee4226a2fd-tapbfb32e9e-52', 'timestamp': '2025-10-08T19:10:08.351323', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'tapbfb32e9e-52', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:85:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfb32e9e-52'}, 'message_id': '67d5cdf2-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.877164166, 'message_signature': 'ae43ec7baf3a21b0a38bc95755108c5b670b5434a182e8285f108b602d8df2e1'}]}, 'timestamp': '2025-10-08 19:10:08.351828', '_unique_id': 'f8092e3b04864e098c01ae36f233d602'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.352 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.354 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.354 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f6104bb4-175a-493d-b877-0618a5874d7b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000006-783f8889-2bc8-4641-bdb9-95ee4226a2fd-tapbfb32e9e-52', 'timestamp': '2025-10-08T19:10:08.354269', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'tapbfb32e9e-52', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:85:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfb32e9e-52'}, 'message_id': '67d6417e-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.877164166, 'message_signature': '9d0b519d2bec54f8e9b9698aa6d6a4ab7096d422fb9c4ce5f95f5360038b5d6d'}]}, 'timestamp': '2025-10-08 19:10:08.354817', '_unique_id': 'ee5ee5ec92e744218a955a4e829324ae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.355 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.357 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.357 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f131d7ba-bb5f-4657-a0f2-1c758e9cd286', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000006-783f8889-2bc8-4641-bdb9-95ee4226a2fd-tapbfb32e9e-52', 'timestamp': '2025-10-08T19:10:08.357545', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'tapbfb32e9e-52', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:85:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfb32e9e-52'}, 'message_id': '67d6c5f4-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.877164166, 'message_signature': '02c467156e728730d7b36e857a556370ea63ac6343e654775b7c4c2b2cbdb5b3'}]}, 'timestamp': '2025-10-08 19:10:08.358203', '_unique_id': '056debf2c3aa4fd0956f9e32388c8254'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.359 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.362 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.363 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.device.read.latency volume: 471630043 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.363 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.device.read.latency volume: 2636697 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cda39725-6986-4cf1-a693-b1ee04724d13', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 471630043, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd-vda', 'timestamp': '2025-10-08T19:10:08.363268', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '67d79f1a-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.884361186, 'message_signature': '3657ffc056deceba6ac420954edb7e16d797e858954b78f0508f17658e131d95'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2636697, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd-sda', 'timestamp': '2025-10-08T19:10:08.363268', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '67d7aa64-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.884361186, 'message_signature': '7087526c63025fd99427088ce14526570ee6417316c7fa2f3e5ed6979058e4f6'}]}, 'timestamp': '2025-10-08 19:10:08.363933', '_unique_id': '914f21d9da714ec185d93af47ac6731c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.364 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.365 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.365 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0c01d58e-8a10-4309-9853-13f19b96fb19', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000006-783f8889-2bc8-4641-bdb9-95ee4226a2fd-tapbfb32e9e-52', 'timestamp': '2025-10-08T19:10:08.365588', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'tapbfb32e9e-52', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:85:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfb32e9e-52'}, 'message_id': '67d7f5d2-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.877164166, 'message_signature': '6293799f771740a27f35e335f34030eeea97972b354c751b600df438b1516045'}]}, 'timestamp': '2025-10-08 19:10:08.365830', '_unique_id': 'afa7813c34e44c07af560998d96bf7e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.366 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e97ff52e-8119-4e95-8660-4df76dc6c25a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd-vda', 'timestamp': '2025-10-08T19:10:08.366960', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '67d82b2e-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.914143417, 'message_signature': '1a9f66124ae09afc20ab885703c77702192820b0d46b6bbfacef391e780fd64c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd-sda', 'timestamp': '2025-10-08T19:10:08.366960', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '67d83344-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.914143417, 'message_signature': '7223313bf0682c50ba3d12bcdd5f3981814cc070b33fb62f2660d04ac7495591'}]}, 'timestamp': '2025-10-08 19:10:08.367382', '_unique_id': '4ec8a17cc55643648f19566188eb6caa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.367 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.368 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.368 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/cpu volume: 9220000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f7f6d6c6-0ec0-48e6-9bce-1f9c88a9572a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9220000000, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'timestamp': '2025-10-08T19:10:08.368457', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '67d86580-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.962898662, 'message_signature': '037f550e0e47e4ae8a27015d8886bdb8fbfb2ed3a37266a0882b763017060f34'}]}, 'timestamp': '2025-10-08 19:10:08.368682', '_unique_id': 'fe307d5da9b343c69ec556e5ba953d7d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.device.read.requests volume: 770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.369 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d603454-6cf8-49d1-8512-6c59e249833f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 770, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd-vda', 'timestamp': '2025-10-08T19:10:08.369723', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '67d896c2-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.884361186, 'message_signature': 'be2ab941aaeed56d98b2451da57746c09fb9f552b5013cfbf1a468980a903193'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 
'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd-sda', 'timestamp': '2025-10-08T19:10:08.369723', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '67d89f50-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.884361186, 'message_signature': 'd6840f9f2dc274980c15dc9d1b58c229e9091e34981c6b70f1b6abe68deaa180'}]}, 'timestamp': '2025-10-08 19:10:08.370149', '_unique_id': '9f0dde9bb34f4af881daf59f1aa52ac5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.370 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '47a6b627-fc5e-4fb5-8652-1513110f5374', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000006-783f8889-2bc8-4641-bdb9-95ee4226a2fd-tapbfb32e9e-52', 'timestamp': '2025-10-08T19:10:08.371238', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'tapbfb32e9e-52', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:85:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfb32e9e-52'}, 'message_id': '67d8d204-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.877164166, 'message_signature': '08651bd8d1094a3105cd4850ed5d07504701c5c1c4c69a710ff754f06fbe0a66'}]}, 'timestamp': '2025-10-08 19:10:08.371461', '_unique_id': '87dba970f3a74036b4c8677fd0696a0a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.371 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.372 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.372 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '415e37a7-a946-42bd-b269-28f8a4c62d43', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000006-783f8889-2bc8-4641-bdb9-95ee4226a2fd-tapbfb32e9e-52', 'timestamp': '2025-10-08T19:10:08.372489', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'tapbfb32e9e-52', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:85:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfb32e9e-52'}, 'message_id': '67d902e2-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.877164166, 'message_signature': 'acffae1af86121058c849fc4c65e6666e7c67074728dfd06efe201a9da9224cf'}]}, 'timestamp': '2025-10-08 19:10:08.372712', '_unique_id': 'ee51f4c9b1194e6ca60025fbe5fa5023'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.373 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '517f211b-ed32-497f-b2ec-d1ffcb3e23f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd-vda', 'timestamp': '2025-10-08T19:10:08.373739', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '67d9338e-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.884361186, 'message_signature': '542216245f51a1271efc25227d94683acccf0707d58a0b4ab1a9a2c9869ac06f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd-sda', 'timestamp': '2025-10-08T19:10:08.373739', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'instance-00000006', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '67d93c3a-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.884361186, 'message_signature': 'b3222a8391f30d5307eb67567120fe721114191238020c244680afe52027fffc'}]}, 'timestamp': '2025-10-08 19:10:08.374165', '_unique_id': 'be1a80ae88bc4814acff129e670026c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.374 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 DEBUG ceilometer.compute.pollsters [-] 783f8889-2bc8-4641-bdb9-95ee4226a2fd/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dc7ab9d5-19ff-404e-a6a5-f120317cb0be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-00000006-783f8889-2bc8-4641-bdb9-95ee4226a2fd-tapbfb32e9e-52', 'timestamp': '2025-10-08T19:10:08.375251', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1641480242', 'name': 'tapbfb32e9e-52', 'instance_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:4e:85:2e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapbfb32e9e-52'}, 'message_id': '67d96ebc-a47a-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1278.877164166, 'message_signature': '1070f57bc1646ac196e2e848dab3aead0fe3830b85c4671ae3215e174f35b1fd'}]}, 'timestamp': '2025-10-08 19:10:08.375473', '_unique_id': '6e628c5fa28f44f39f59615f90bad475'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:10:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:10:08.375 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:10:08 compute-0 nova_compute[117514]: 2025-10-08 19:10:08.406 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:10:08 compute-0 nova_compute[117514]: 2025-10-08 19:10:08.407 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:10:08 compute-0 nova_compute[117514]: 2025-10-08 19:10:08.407 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 19:10:08 compute-0 nova_compute[117514]: 2025-10-08 19:10:08.467 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 19:10:08 compute-0 nova_compute[117514]: 2025-10-08 19:10:08.467 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:10:08 compute-0 nova_compute[117514]: 2025-10-08 19:10:08.467 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 19:10:08 compute-0 nova_compute[117514]: 2025-10-08 19:10:08.718 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:10:09 compute-0 nova_compute[117514]: 2025-10-08 19:10:09.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:10:10 compute-0 nova_compute[117514]: 2025-10-08 19:10:10.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:11 compute-0 nova_compute[117514]: 2025-10-08 19:10:11.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:11 compute-0 podman[147547]: 2025-10-08 19:10:11.648720307 +0000 UTC m=+0.075476407 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 19:10:13 compute-0 ovn_controller[19759]: 2025-10-08T19:10:13Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:4e:85:2e 10.100.0.14
Oct  8 19:10:13 compute-0 ovn_controller[19759]: 2025-10-08T19:10:13Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4e:85:2e 10.100.0.14
Oct  8 19:10:15 compute-0 nova_compute[117514]: 2025-10-08 19:10:15.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:16 compute-0 nova_compute[117514]: 2025-10-08 19:10:16.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:19 compute-0 nova_compute[117514]: 2025-10-08 19:10:19.314 2 INFO nova.compute.manager [None req-9d9e4250-7fda-4296-8a89-1c404f607141 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Get console output#033[00m
Oct  8 19:10:19 compute-0 nova_compute[117514]: 2025-10-08 19:10:19.321 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 19:10:20 compute-0 nova_compute[117514]: 2025-10-08 19:10:20.904 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:20 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:20.905 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:75:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '5e:14:dd:63:55:2a'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:10:20 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:20.906 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 19:10:20 compute-0 nova_compute[117514]: 2025-10-08 19:10:20.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:21 compute-0 nova_compute[117514]: 2025-10-08 19:10:21.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:21 compute-0 podman[147574]: 2025-10-08 19:10:21.672313611 +0000 UTC m=+0.089306981 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 19:10:21 compute-0 nova_compute[117514]: 2025-10-08 19:10:21.937 2 DEBUG oslo_concurrency.lockutils [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "interface-783f8889-2bc8-4641-bdb9-95ee4226a2fd-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:10:21 compute-0 nova_compute[117514]: 2025-10-08 19:10:21.938 2 DEBUG oslo_concurrency.lockutils [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "interface-783f8889-2bc8-4641-bdb9-95ee4226a2fd-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:10:21 compute-0 nova_compute[117514]: 2025-10-08 19:10:21.939 2 DEBUG nova.objects.instance [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'flavor' on Instance uuid 783f8889-2bc8-4641-bdb9-95ee4226a2fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:10:23 compute-0 nova_compute[117514]: 2025-10-08 19:10:23.118 2 DEBUG nova.objects.instance [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'pci_requests' on Instance uuid 783f8889-2bc8-4641-bdb9-95ee4226a2fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:10:23 compute-0 nova_compute[117514]: 2025-10-08 19:10:23.133 2 DEBUG nova.network.neutron [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 19:10:24 compute-0 nova_compute[117514]: 2025-10-08 19:10:24.158 2 DEBUG nova.policy [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 19:10:25 compute-0 nova_compute[117514]: 2025-10-08 19:10:25.959 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:26 compute-0 nova_compute[117514]: 2025-10-08 19:10:26.215 2 DEBUG nova.network.neutron [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Successfully created port: ea81e5cb-74ba-43da-a780-3f1f699fa0d6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 19:10:26 compute-0 nova_compute[117514]: 2025-10-08 19:10:26.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:27 compute-0 nova_compute[117514]: 2025-10-08 19:10:27.301 2 DEBUG nova.network.neutron [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Successfully updated port: ea81e5cb-74ba-43da-a780-3f1f699fa0d6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 19:10:27 compute-0 nova_compute[117514]: 2025-10-08 19:10:27.320 2 DEBUG oslo_concurrency.lockutils [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:10:27 compute-0 nova_compute[117514]: 2025-10-08 19:10:27.320 2 DEBUG oslo_concurrency.lockutils [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquired lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:10:27 compute-0 nova_compute[117514]: 2025-10-08 19:10:27.320 2 DEBUG nova.network.neutron [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 19:10:27 compute-0 nova_compute[117514]: 2025-10-08 19:10:27.455 2 DEBUG nova.compute.manager [req-73b66b7a-2084-400a-be0c-e954fb22f600 req-887a3fc2-4baa-498d-8aa7-9eba7dcdd365 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received event network-changed-ea81e5cb-74ba-43da-a780-3f1f699fa0d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:10:27 compute-0 nova_compute[117514]: 2025-10-08 19:10:27.456 2 DEBUG nova.compute.manager [req-73b66b7a-2084-400a-be0c-e954fb22f600 req-887a3fc2-4baa-498d-8aa7-9eba7dcdd365 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Refreshing instance network info cache due to event network-changed-ea81e5cb-74ba-43da-a780-3f1f699fa0d6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 19:10:27 compute-0 nova_compute[117514]: 2025-10-08 19:10:27.456 2 DEBUG oslo_concurrency.lockutils [req-73b66b7a-2084-400a-be0c-e954fb22f600 req-887a3fc2-4baa-498d-8aa7-9eba7dcdd365 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:10:27 compute-0 podman[147595]: 2025-10-08 19:10:27.66197255 +0000 UTC m=+0.073628133 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, container_name=multipathd, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  8 19:10:27 compute-0 podman[147594]: 2025-10-08 19:10:27.675004011 +0000 UTC m=+0.083904344 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., distribution-scope=public, config_id=edpm, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct  8 19:10:28 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:28.909 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f81f7a-64d8-418a-a74c-b879bd6deb83, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.139 2 DEBUG nova.network.neutron [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Updating instance_info_cache with network_info: [{"id": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "address": "fa:16:3e:4e:85:2e", "network": {"id": "0d073e98-c9f2-4b90-8237-84ff2fa99090", "bridge": "br-int", "label": "tempest-network-smoke--1785011615", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb32e9e-52", "ovs_interfaceid": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "address": "fa:16:3e:11:ac:ba", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea81e5cb-74", "ovs_interfaceid": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.162 2 DEBUG oslo_concurrency.lockutils [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Releasing lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.164 2 DEBUG oslo_concurrency.lockutils [req-73b66b7a-2084-400a-be0c-e954fb22f600 req-887a3fc2-4baa-498d-8aa7-9eba7dcdd365 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.164 2 DEBUG nova.network.neutron [req-73b66b7a-2084-400a-be0c-e954fb22f600 req-887a3fc2-4baa-498d-8aa7-9eba7dcdd365 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Refreshing network info cache for port ea81e5cb-74ba-43da-a780-3f1f699fa0d6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.169 2 DEBUG nova.virt.libvirt.vif [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:09:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1641480242',display_name='tempest-TestNetworkBasicOps-server-1641480242',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1641480242',id=6,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPSyEE+QeB2DOtd7xoaY+J9mVl+DzPE43UDhso7eEGO9aQXs3wmj/YcqHfJ97lRUVFOa3dbwNiIUyunSI3DyzjQf/v6cjCZ2KkxRD0GJnQ0zRM5omnXaZRnz3Bq5VONa9g==',key_name='tempest-TestNetworkBasicOps-1535027603',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:09:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-pbt1zket',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:09:58Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=783f8889-2bc8-4641-bdb9-95ee4226a2fd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "address": "fa:16:3e:11:ac:ba", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea81e5cb-74", "ovs_interfaceid": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.169 2 DEBUG nova.network.os_vif_util [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "address": "fa:16:3e:11:ac:ba", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea81e5cb-74", "ovs_interfaceid": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.170 2 DEBUG nova.network.os_vif_util [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:11:ac:ba,bridge_name='br-int',has_traffic_filtering=True,id=ea81e5cb-74ba-43da-a780-3f1f699fa0d6,network=Network(e3bc67bd-dd21-4701-b445-33eb52179602),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea81e5cb-74') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.171 2 DEBUG os_vif [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:11:ac:ba,bridge_name='br-int',has_traffic_filtering=True,id=ea81e5cb-74ba-43da-a780-3f1f699fa0d6,network=Network(e3bc67bd-dd21-4701-b445-33eb52179602),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea81e5cb-74') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.173 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.173 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.177 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapea81e5cb-74, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.178 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapea81e5cb-74, col_values=(('external_ids', {'iface-id': 'ea81e5cb-74ba-43da-a780-3f1f699fa0d6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:11:ac:ba', 'vm-uuid': '783f8889-2bc8-4641-bdb9-95ee4226a2fd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:10:30 compute-0 NetworkManager[1035]: <info>  [1759950630.1815] manager: (tapea81e5cb-74): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/57)
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.189 2 INFO os_vif [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:11:ac:ba,bridge_name='br-int',has_traffic_filtering=True,id=ea81e5cb-74ba-43da-a780-3f1f699fa0d6,network=Network(e3bc67bd-dd21-4701-b445-33eb52179602),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea81e5cb-74')#033[00m
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.191 2 DEBUG nova.virt.libvirt.vif [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:09:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1641480242',display_name='tempest-TestNetworkBasicOps-server-1641480242',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1641480242',id=6,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPSyEE+QeB2DOtd7xoaY+J9mVl+DzPE43UDhso7eEGO9aQXs3wmj/YcqHfJ97lRUVFOa3dbwNiIUyunSI3DyzjQf/v6cjCZ2KkxRD0GJnQ0zRM5omnXaZRnz3Bq5VONa9g==',key_name='tempest-TestNetworkBasicOps-1535027603',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:09:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-pbt1zket',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:09:58Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=783f8889-2bc8-4641-bdb9-95ee4226a2fd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "address": "fa:16:3e:11:ac:ba", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea81e5cb-74", "ovs_interfaceid": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.191 2 DEBUG nova.network.os_vif_util [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "address": "fa:16:3e:11:ac:ba", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea81e5cb-74", "ovs_interfaceid": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.192 2 DEBUG nova.network.os_vif_util [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:11:ac:ba,bridge_name='br-int',has_traffic_filtering=True,id=ea81e5cb-74ba-43da-a780-3f1f699fa0d6,network=Network(e3bc67bd-dd21-4701-b445-33eb52179602),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea81e5cb-74') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.195 2 DEBUG nova.virt.libvirt.guest [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] attach device xml: <interface type="ethernet">
Oct  8 19:10:30 compute-0 nova_compute[117514]:  <mac address="fa:16:3e:11:ac:ba"/>
Oct  8 19:10:30 compute-0 nova_compute[117514]:  <model type="virtio"/>
Oct  8 19:10:30 compute-0 nova_compute[117514]:  <driver name="vhost" rx_queue_size="512"/>
Oct  8 19:10:30 compute-0 nova_compute[117514]:  <mtu size="1442"/>
Oct  8 19:10:30 compute-0 nova_compute[117514]:  <target dev="tapea81e5cb-74"/>
Oct  8 19:10:30 compute-0 nova_compute[117514]: </interface>
Oct  8 19:10:30 compute-0 nova_compute[117514]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Oct  8 19:10:30 compute-0 kernel: tapea81e5cb-74: entered promiscuous mode
Oct  8 19:10:30 compute-0 NetworkManager[1035]: <info>  [1759950630.2102] manager: (tapea81e5cb-74): new Tun device (/org/freedesktop/NetworkManager/Devices/58)
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:30 compute-0 ovn_controller[19759]: 2025-10-08T19:10:30Z|00089|binding|INFO|Claiming lport ea81e5cb-74ba-43da-a780-3f1f699fa0d6 for this chassis.
Oct  8 19:10:30 compute-0 ovn_controller[19759]: 2025-10-08T19:10:30Z|00090|binding|INFO|ea81e5cb-74ba-43da-a780-3f1f699fa0d6: Claiming fa:16:3e:11:ac:ba 10.100.0.22
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.219 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:11:ac:ba 10.100.0.22'], port_security=['fa:16:3e:11:ac:ba 10.100.0.22'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.22/28', 'neutron:device_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e3bc67bd-dd21-4701-b445-33eb52179602', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'be57f10c-6afc-483d-a1fa-fab953b8fe3e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=325bd26c-56bb-4683-8b62-92cc8f266207, chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=ea81e5cb-74ba-43da-a780-3f1f699fa0d6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.220 28643 INFO neutron.agent.ovn.metadata.agent [-] Port ea81e5cb-74ba-43da-a780-3f1f699fa0d6 in datapath e3bc67bd-dd21-4701-b445-33eb52179602 bound to our chassis#033[00m
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.222 28643 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e3bc67bd-dd21-4701-b445-33eb52179602#033[00m
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.237 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[a55737fd-65e8-4a8b-9130-f3f19e781f7e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.238 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape3bc67bd-d1 in ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.242 144726 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape3bc67bd-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.242 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[c907d04f-59e0-4dcd-8867-bdef60d54469]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.244 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[55d87543-9be8-4a5c-840f-987357fff54e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:10:30 compute-0 systemd-udevd[147643]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.257 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[80c8abc4-cf99-4fdd-8187-c723aafcc9a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:30 compute-0 ovn_controller[19759]: 2025-10-08T19:10:30Z|00091|binding|INFO|Setting lport ea81e5cb-74ba-43da-a780-3f1f699fa0d6 ovn-installed in OVS
Oct  8 19:10:30 compute-0 ovn_controller[19759]: 2025-10-08T19:10:30Z|00092|binding|INFO|Setting lport ea81e5cb-74ba-43da-a780-3f1f699fa0d6 up in Southbound
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:30 compute-0 NetworkManager[1035]: <info>  [1759950630.2752] device (tapea81e5cb-74): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 19:10:30 compute-0 NetworkManager[1035]: <info>  [1759950630.2780] device (tapea81e5cb-74): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.279 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[84627644-815c-4bec-a595-27a17d979b8a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.327 2 DEBUG nova.virt.libvirt.driver [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.326 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[6de15016-427b-47c0-9799-60e0799f1308]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.327 2 DEBUG nova.virt.libvirt.driver [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.327 2 DEBUG nova.virt.libvirt.driver [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No VIF found with MAC fa:16:3e:4e:85:2e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.328 2 DEBUG nova.virt.libvirt.driver [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No VIF found with MAC fa:16:3e:11:ac:ba, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 19:10:30 compute-0 NetworkManager[1035]: <info>  [1759950630.3350] manager: (tape3bc67bd-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/59)
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.334 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[85a6f7ca-1c0a-4abd-ad98-a6eaf188ff4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.374 2 DEBUG nova.virt.libvirt.guest [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 19:10:30 compute-0 nova_compute[117514]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 19:10:30 compute-0 nova_compute[117514]:  <nova:name>tempest-TestNetworkBasicOps-server-1641480242</nova:name>
Oct  8 19:10:30 compute-0 nova_compute[117514]:  <nova:creationTime>2025-10-08 19:10:30</nova:creationTime>
Oct  8 19:10:30 compute-0 nova_compute[117514]:  <nova:flavor name="m1.nano">
Oct  8 19:10:30 compute-0 nova_compute[117514]:    <nova:memory>128</nova:memory>
Oct  8 19:10:30 compute-0 nova_compute[117514]:    <nova:disk>1</nova:disk>
Oct  8 19:10:30 compute-0 nova_compute[117514]:    <nova:swap>0</nova:swap>
Oct  8 19:10:30 compute-0 nova_compute[117514]:    <nova:ephemeral>0</nova:ephemeral>
Oct  8 19:10:30 compute-0 nova_compute[117514]:    <nova:vcpus>1</nova:vcpus>
Oct  8 19:10:30 compute-0 nova_compute[117514]:  </nova:flavor>
Oct  8 19:10:30 compute-0 nova_compute[117514]:  <nova:owner>
Oct  8 19:10:30 compute-0 nova_compute[117514]:    <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct  8 19:10:30 compute-0 nova_compute[117514]:    <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct  8 19:10:30 compute-0 nova_compute[117514]:  </nova:owner>
Oct  8 19:10:30 compute-0 nova_compute[117514]:  <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct  8 19:10:30 compute-0 nova_compute[117514]:  <nova:ports>
Oct  8 19:10:30 compute-0 nova_compute[117514]:    <nova:port uuid="bfb32e9e-52b6-4043-b9a6-129d11fa2814">
Oct  8 19:10:30 compute-0 nova_compute[117514]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Oct  8 19:10:30 compute-0 nova_compute[117514]:    </nova:port>
Oct  8 19:10:30 compute-0 nova_compute[117514]:    <nova:port uuid="ea81e5cb-74ba-43da-a780-3f1f699fa0d6">
Oct  8 19:10:30 compute-0 nova_compute[117514]:      <nova:ip type="fixed" address="10.100.0.22" ipVersion="4"/>
Oct  8 19:10:30 compute-0 nova_compute[117514]:    </nova:port>
Oct  8 19:10:30 compute-0 nova_compute[117514]:  </nova:ports>
Oct  8 19:10:30 compute-0 nova_compute[117514]: </nova:instance>
Oct  8 19:10:30 compute-0 nova_compute[117514]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.399 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[139d505c-4552-4f51-9b58-ddda5401b8d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.405 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[878301e3-c867-42b7-96ba-5c811cdd3cb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.411 2 DEBUG oslo_concurrency.lockutils [None req-346e0911-941f-4fae-87bb-91c011a3b736 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "interface-783f8889-2bc8-4641-bdb9-95ee4226a2fd-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 8.474s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:10:30 compute-0 NetworkManager[1035]: <info>  [1759950630.4390] device (tape3bc67bd-d0): carrier: link connected
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.444 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[95ff4254-4085-4d3d-afc3-2207a81ab8f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.472 2 DEBUG nova.compute.manager [req-e8678dfb-f035-4c70-83b4-67f3ac0735d2 req-0b4d8d3d-dc2b-4cf9-9a53-8fe6ca67fd4a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received event network-vif-plugged-ea81e5cb-74ba-43da-a780-3f1f699fa0d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.472 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[1a24a715-8530-4812-a83e-da8d6385f160]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape3bc67bd-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:4f:d1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 130101, 'reachable_time': 29029, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 147668, 'error': None, 'target': 'ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.473 2 DEBUG oslo_concurrency.lockutils [req-e8678dfb-f035-4c70-83b4-67f3ac0735d2 req-0b4d8d3d-dc2b-4cf9-9a53-8fe6ca67fd4a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.473 2 DEBUG oslo_concurrency.lockutils [req-e8678dfb-f035-4c70-83b4-67f3ac0735d2 req-0b4d8d3d-dc2b-4cf9-9a53-8fe6ca67fd4a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.474 2 DEBUG oslo_concurrency.lockutils [req-e8678dfb-f035-4c70-83b4-67f3ac0735d2 req-0b4d8d3d-dc2b-4cf9-9a53-8fe6ca67fd4a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.474 2 DEBUG nova.compute.manager [req-e8678dfb-f035-4c70-83b4-67f3ac0735d2 req-0b4d8d3d-dc2b-4cf9-9a53-8fe6ca67fd4a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] No waiting events found dispatching network-vif-plugged-ea81e5cb-74ba-43da-a780-3f1f699fa0d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.475 2 WARNING nova.compute.manager [req-e8678dfb-f035-4c70-83b4-67f3ac0735d2 req-0b4d8d3d-dc2b-4cf9-9a53-8fe6ca67fd4a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received unexpected event network-vif-plugged-ea81e5cb-74ba-43da-a780-3f1f699fa0d6 for instance with vm_state active and task_state None.#033[00m
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.497 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[72df0f75-e409-4fa8-ba9b-b44c8781df77]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5d:4fd1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 130101, 'tstamp': 130101}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 147669, 'error': None, 'target': 'ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.520 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[0a4ab9b3-45fe-4b4b-aee4-e7da262d8712]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape3bc67bd-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:4f:d1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 130101, 'reachable_time': 29029, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 147670, 'error': None, 'target': 'ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.567 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[b0941d90-61a0-4733-b1f2-a81d6a710edf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.659 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[d001dad3-da35-4d79-8d31-2a38f447cb83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.662 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape3bc67bd-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.663 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.663 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape3bc67bd-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:30 compute-0 NetworkManager[1035]: <info>  [1759950630.6671] manager: (tape3bc67bd-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Oct  8 19:10:30 compute-0 kernel: tape3bc67bd-d0: entered promiscuous mode
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.671 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape3bc67bd-d0, col_values=(('external_ids', {'iface-id': 'd935682a-e42a-4970-b54c-b54c616cf798'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:30 compute-0 ovn_controller[19759]: 2025-10-08T19:10:30Z|00093|binding|INFO|Releasing lport d935682a-e42a-4970-b54c-b54c616cf798 from this chassis (sb_readonly=0)
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.689 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.692 28643 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e3bc67bd-dd21-4701-b445-33eb52179602.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e3bc67bd-dd21-4701-b445-33eb52179602.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.693 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[3e4c012b-71c5-45b3-8ab9-d7a96c426bb1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.694 28643 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]: global
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]:    log         /dev/log local0 debug
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]:    log-tag     haproxy-metadata-proxy-e3bc67bd-dd21-4701-b445-33eb52179602
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]:    user        root
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]:    group       root
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]:    maxconn     1024
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]:    pidfile     /var/lib/neutron/external/pids/e3bc67bd-dd21-4701-b445-33eb52179602.pid.haproxy
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]:    daemon
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]: 
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]: defaults
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]:    log global
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]:    mode http
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]:    option httplog
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]:    option dontlognull
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]:    option http-server-close
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]:    option forwardfor
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]:    retries                 3
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]:    timeout http-request    30s
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]:    timeout connect         30s
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]:    timeout client          32s
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]:    timeout server          32s
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]:    timeout http-keep-alive 30s
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]: 
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]: 
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]: listen listener
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]:    bind 169.254.169.254:80
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]:    http-request add-header X-OVN-Network-ID e3bc67bd-dd21-4701-b445-33eb52179602
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 19:10:30 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:30.695 28643 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602', 'env', 'PROCESS_TAG=haproxy-e3bc67bd-dd21-4701-b445-33eb52179602', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e3bc67bd-dd21-4701-b445-33eb52179602.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 19:10:30 compute-0 nova_compute[117514]: 2025-10-08 19:10:30.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:31 compute-0 podman[147702]: 2025-10-08 19:10:31.138712833 +0000 UTC m=+0.075484598 container create 0263a6d21769e3c38d37f8ef90b039ce1a54c225bd0e154d3b91c06548a70e40 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0)
Oct  8 19:10:31 compute-0 systemd[1]: Started libpod-conmon-0263a6d21769e3c38d37f8ef90b039ce1a54c225bd0e154d3b91c06548a70e40.scope.
Oct  8 19:10:31 compute-0 podman[147702]: 2025-10-08 19:10:31.099022383 +0000 UTC m=+0.035794188 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct  8 19:10:31 compute-0 systemd[1]: Started libcrun container.
Oct  8 19:10:31 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dbb48eabd75c06324d43161ee30ba9aa731d2058597c747932479a573242f471/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 19:10:31 compute-0 podman[147702]: 2025-10-08 19:10:31.217889697 +0000 UTC m=+0.154661462 container init 0263a6d21769e3c38d37f8ef90b039ce1a54c225bd0e154d3b91c06548a70e40 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team)
Oct  8 19:10:31 compute-0 podman[147702]: 2025-10-08 19:10:31.225014015 +0000 UTC m=+0.161785750 container start 0263a6d21769e3c38d37f8ef90b039ce1a54c225bd0e154d3b91c06548a70e40 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team)
Oct  8 19:10:31 compute-0 neutron-haproxy-ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602[147719]: [NOTICE]   (147733) : New worker (147741) forked
Oct  8 19:10:31 compute-0 neutron-haproxy-ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602[147719]: [NOTICE]   (147733) : Loading success.
Oct  8 19:10:31 compute-0 podman[147716]: 2025-10-08 19:10:31.259437071 +0000 UTC m=+0.079953688 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 19:10:31 compute-0 nova_compute[117514]: 2025-10-08 19:10:31.395 2 DEBUG nova.network.neutron [req-73b66b7a-2084-400a-be0c-e954fb22f600 req-887a3fc2-4baa-498d-8aa7-9eba7dcdd365 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Updated VIF entry in instance network info cache for port ea81e5cb-74ba-43da-a780-3f1f699fa0d6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 19:10:31 compute-0 nova_compute[117514]: 2025-10-08 19:10:31.396 2 DEBUG nova.network.neutron [req-73b66b7a-2084-400a-be0c-e954fb22f600 req-887a3fc2-4baa-498d-8aa7-9eba7dcdd365 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Updating instance_info_cache with network_info: [{"id": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "address": "fa:16:3e:4e:85:2e", "network": {"id": "0d073e98-c9f2-4b90-8237-84ff2fa99090", "bridge": "br-int", "label": "tempest-network-smoke--1785011615", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb32e9e-52", "ovs_interfaceid": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "address": "fa:16:3e:11:ac:ba", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, 
"tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea81e5cb-74", "ovs_interfaceid": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:10:31 compute-0 ovn_controller[19759]: 2025-10-08T19:10:31Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:11:ac:ba 10.100.0.22
Oct  8 19:10:31 compute-0 ovn_controller[19759]: 2025-10-08T19:10:31Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:11:ac:ba 10.100.0.22
Oct  8 19:10:31 compute-0 nova_compute[117514]: 2025-10-08 19:10:31.412 2 DEBUG oslo_concurrency.lockutils [req-73b66b7a-2084-400a-be0c-e954fb22f600 req-887a3fc2-4baa-498d-8aa7-9eba7dcdd365 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:10:32 compute-0 nova_compute[117514]: 2025-10-08 19:10:32.559 2 DEBUG nova.compute.manager [req-9d8caff5-883f-4c82-864a-561afd01d1cc req-8eb2906a-7337-4758-a46f-d8188665bea0 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received event network-vif-plugged-ea81e5cb-74ba-43da-a780-3f1f699fa0d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:10:32 compute-0 nova_compute[117514]: 2025-10-08 19:10:32.560 2 DEBUG oslo_concurrency.lockutils [req-9d8caff5-883f-4c82-864a-561afd01d1cc req-8eb2906a-7337-4758-a46f-d8188665bea0 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:10:32 compute-0 nova_compute[117514]: 2025-10-08 19:10:32.560 2 DEBUG oslo_concurrency.lockutils [req-9d8caff5-883f-4c82-864a-561afd01d1cc req-8eb2906a-7337-4758-a46f-d8188665bea0 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:10:32 compute-0 nova_compute[117514]: 2025-10-08 19:10:32.561 2 DEBUG oslo_concurrency.lockutils [req-9d8caff5-883f-4c82-864a-561afd01d1cc req-8eb2906a-7337-4758-a46f-d8188665bea0 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:10:32 compute-0 nova_compute[117514]: 2025-10-08 19:10:32.561 2 DEBUG nova.compute.manager [req-9d8caff5-883f-4c82-864a-561afd01d1cc req-8eb2906a-7337-4758-a46f-d8188665bea0 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] No waiting events found dispatching network-vif-plugged-ea81e5cb-74ba-43da-a780-3f1f699fa0d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:10:32 compute-0 nova_compute[117514]: 2025-10-08 19:10:32.561 2 WARNING nova.compute.manager [req-9d8caff5-883f-4c82-864a-561afd01d1cc req-8eb2906a-7337-4758-a46f-d8188665bea0 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received unexpected event network-vif-plugged-ea81e5cb-74ba-43da-a780-3f1f699fa0d6 for instance with vm_state active and task_state None.#033[00m
Oct  8 19:10:33 compute-0 podman[147757]: 2025-10-08 19:10:33.674582698 +0000 UTC m=+0.069918645 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Oct  8 19:10:33 compute-0 podman[147755]: 2025-10-08 19:10:33.683730785 +0000 UTC m=+0.088765445 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  8 19:10:33 compute-0 podman[147756]: 2025-10-08 19:10:33.733898741 +0000 UTC m=+0.139698324 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  8 19:10:35 compute-0 nova_compute[117514]: 2025-10-08 19:10:35.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:35 compute-0 nova_compute[117514]: 2025-10-08 19:10:35.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:40 compute-0 nova_compute[117514]: 2025-10-08 19:10:40.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:40 compute-0 nova_compute[117514]: 2025-10-08 19:10:40.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:42 compute-0 podman[147817]: 2025-10-08 19:10:42.636963316 +0000 UTC m=+0.059150213 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 19:10:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:44.231 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:10:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:44.232 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:10:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:44.233 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:10:44 compute-0 nova_compute[117514]: 2025-10-08 19:10:44.624 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "6af51230-93a7-45ef-9a1e-c47302f43bcf" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:10:44 compute-0 nova_compute[117514]: 2025-10-08 19:10:44.625 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "6af51230-93a7-45ef-9a1e-c47302f43bcf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:10:44 compute-0 nova_compute[117514]: 2025-10-08 19:10:44.643 2 DEBUG nova.compute.manager [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 19:10:44 compute-0 nova_compute[117514]: 2025-10-08 19:10:44.727 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:10:44 compute-0 nova_compute[117514]: 2025-10-08 19:10:44.727 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:10:44 compute-0 nova_compute[117514]: 2025-10-08 19:10:44.737 2 DEBUG nova.virt.hardware [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 19:10:44 compute-0 nova_compute[117514]: 2025-10-08 19:10:44.738 2 INFO nova.compute.claims [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Claim successful on node compute-0.ctlplane.example.com#033[00m
Oct  8 19:10:44 compute-0 nova_compute[117514]: 2025-10-08 19:10:44.891 2 DEBUG nova.compute.provider_tree [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:10:44 compute-0 nova_compute[117514]: 2025-10-08 19:10:44.907 2 DEBUG nova.scheduler.client.report [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:10:44 compute-0 nova_compute[117514]: 2025-10-08 19:10:44.939 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.212s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:10:44 compute-0 nova_compute[117514]: 2025-10-08 19:10:44.940 2 DEBUG nova.compute.manager [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 19:10:44 compute-0 nova_compute[117514]: 2025-10-08 19:10:44.994 2 DEBUG nova.compute.manager [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 19:10:44 compute-0 nova_compute[117514]: 2025-10-08 19:10:44.995 2 DEBUG nova.network.neutron [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.014 2 INFO nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.035 2 DEBUG nova.compute.manager [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.123 2 DEBUG nova.compute.manager [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.125 2 DEBUG nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.126 2 INFO nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Creating image(s)#033[00m
Oct  8 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.127 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "/var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.128 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.129 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.153 2 DEBUG oslo_concurrency.processutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.229 2 DEBUG oslo_concurrency.processutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.231 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "008eb3078b811ee47058b7252a820910c35fc6df" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.232 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.255 2 DEBUG oslo_concurrency.processutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.322 2 DEBUG oslo_concurrency.processutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.323 2 DEBUG oslo_concurrency.processutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.375 2 DEBUG oslo_concurrency.processutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/disk 1073741824" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.376 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.377 2 DEBUG oslo_concurrency.processutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.439 2 DEBUG oslo_concurrency.processutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.440 2 DEBUG nova.virt.disk.api [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Checking if we can resize image /var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  8 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.441 2 DEBUG oslo_concurrency.processutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.505 2 DEBUG oslo_concurrency.processutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.506 2 DEBUG nova.virt.disk.api [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Cannot resize image /var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  8 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.507 2 DEBUG nova.objects.instance [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'migration_context' on Instance uuid 6af51230-93a7-45ef-9a1e-c47302f43bcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.529 2 DEBUG nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.530 2 DEBUG nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Ensure instance console log exists: /var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.531 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.531 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.532 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:10:45 compute-0 nova_compute[117514]: 2025-10-08 19:10:45.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:46 compute-0 nova_compute[117514]: 2025-10-08 19:10:46.137 2 DEBUG nova.policy [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 19:10:47 compute-0 nova_compute[117514]: 2025-10-08 19:10:47.719 2 DEBUG nova.network.neutron [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Successfully created port: 062b16e8-3c3b-4520-b0f8-536d588db2f5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 19:10:49 compute-0 nova_compute[117514]: 2025-10-08 19:10:49.242 2 DEBUG nova.network.neutron [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Successfully updated port: 062b16e8-3c3b-4520-b0f8-536d588db2f5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 19:10:49 compute-0 nova_compute[117514]: 2025-10-08 19:10:49.262 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "refresh_cache-6af51230-93a7-45ef-9a1e-c47302f43bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:10:49 compute-0 nova_compute[117514]: 2025-10-08 19:10:49.262 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquired lock "refresh_cache-6af51230-93a7-45ef-9a1e-c47302f43bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:10:49 compute-0 nova_compute[117514]: 2025-10-08 19:10:49.263 2 DEBUG nova.network.neutron [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 19:10:49 compute-0 nova_compute[117514]: 2025-10-08 19:10:49.380 2 DEBUG nova.compute.manager [req-64a2a93f-5a7b-44ee-9876-16e5bef090f3 req-ebe6251b-bdb6-4253-adb9-fd5abb039c97 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Received event network-changed-062b16e8-3c3b-4520-b0f8-536d588db2f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:10:49 compute-0 nova_compute[117514]: 2025-10-08 19:10:49.381 2 DEBUG nova.compute.manager [req-64a2a93f-5a7b-44ee-9876-16e5bef090f3 req-ebe6251b-bdb6-4253-adb9-fd5abb039c97 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Refreshing instance network info cache due to event network-changed-062b16e8-3c3b-4520-b0f8-536d588db2f5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 19:10:49 compute-0 nova_compute[117514]: 2025-10-08 19:10:49.381 2 DEBUG oslo_concurrency.lockutils [req-64a2a93f-5a7b-44ee-9876-16e5bef090f3 req-ebe6251b-bdb6-4253-adb9-fd5abb039c97 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-6af51230-93a7-45ef-9a1e-c47302f43bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:10:50 compute-0 nova_compute[117514]: 2025-10-08 19:10:50.109 2 DEBUG nova.network.neutron [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 19:10:50 compute-0 nova_compute[117514]: 2025-10-08 19:10:50.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:50 compute-0 nova_compute[117514]: 2025-10-08 19:10:50.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.144 2 DEBUG nova.network.neutron [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Updating instance_info_cache with network_info: [{"id": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "address": "fa:16:3e:49:be:8a", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap062b16e8-3c", "ovs_interfaceid": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.165 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Releasing lock "refresh_cache-6af51230-93a7-45ef-9a1e-c47302f43bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.166 2 DEBUG nova.compute.manager [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Instance network_info: |[{"id": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "address": "fa:16:3e:49:be:8a", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap062b16e8-3c", "ovs_interfaceid": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.167 2 DEBUG oslo_concurrency.lockutils [req-64a2a93f-5a7b-44ee-9876-16e5bef090f3 req-ebe6251b-bdb6-4253-adb9-fd5abb039c97 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-6af51230-93a7-45ef-9a1e-c47302f43bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.168 2 DEBUG nova.network.neutron [req-64a2a93f-5a7b-44ee-9876-16e5bef090f3 req-ebe6251b-bdb6-4253-adb9-fd5abb039c97 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Refreshing network info cache for port 062b16e8-3c3b-4520-b0f8-536d588db2f5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.173 2 DEBUG nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Start _get_guest_xml network_info=[{"id": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "address": "fa:16:3e:49:be:8a", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap062b16e8-3c", "ovs_interfaceid": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'image_id': '23cfa426-7011-4566-992d-1c7af39f70dd'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.181 2 WARNING nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.193 2 DEBUG nova.virt.libvirt.host [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.194 2 DEBUG nova.virt.libvirt.host [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.205 2 DEBUG nova.virt.libvirt.host [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.206 2 DEBUG nova.virt.libvirt.host [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.207 2 DEBUG nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.208 2 DEBUG nova.virt.hardware [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T19:05:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e8a148fc-4419-4813-98ff-a17e2a95609e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.208 2 DEBUG nova.virt.hardware [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.209 2 DEBUG nova.virt.hardware [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.209 2 DEBUG nova.virt.hardware [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.210 2 DEBUG nova.virt.hardware [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.210 2 DEBUG nova.virt.hardware [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.211 2 DEBUG nova.virt.hardware [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.211 2 DEBUG nova.virt.hardware [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.212 2 DEBUG nova.virt.hardware [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.213 2 DEBUG nova.virt.hardware [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.213 2 DEBUG nova.virt.hardware [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.220 2 DEBUG nova.virt.libvirt.vif [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:10:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1744393953',display_name='tempest-TestNetworkBasicOps-server-1744393953',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1744393953',id=7,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ/nGXxhsdJmWHabE3HFa5+3pmT1eGAFwd96u9XHC+whrqyLo5hIQAYJiUfXapQHjQsYnRIxe45Y0OXwPlQza5nnuSeUdl81Vlbahpy7snJ2RnOlPvASQfobelq2pqhHKA==',key_name='tempest-TestNetworkBasicOps-575443871',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-ke980fn8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:10:45Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=6af51230-93a7-45ef-9a1e-c47302f43bcf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "address": "fa:16:3e:49:be:8a", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap062b16e8-3c", "ovs_interfaceid": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.221 2 DEBUG nova.network.os_vif_util [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "address": "fa:16:3e:49:be:8a", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap062b16e8-3c", "ovs_interfaceid": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.222 2 DEBUG nova.network.os_vif_util [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:49:be:8a,bridge_name='br-int',has_traffic_filtering=True,id=062b16e8-3c3b-4520-b0f8-536d588db2f5,network=Network(e3bc67bd-dd21-4701-b445-33eb52179602),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap062b16e8-3c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.226 2 DEBUG nova.objects.instance [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6af51230-93a7-45ef-9a1e-c47302f43bcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.242 2 DEBUG nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] End _get_guest_xml xml=<domain type="kvm">
Oct  8 19:10:52 compute-0 nova_compute[117514]:  <uuid>6af51230-93a7-45ef-9a1e-c47302f43bcf</uuid>
Oct  8 19:10:52 compute-0 nova_compute[117514]:  <name>instance-00000007</name>
Oct  8 19:10:52 compute-0 nova_compute[117514]:  <memory>131072</memory>
Oct  8 19:10:52 compute-0 nova_compute[117514]:  <vcpu>1</vcpu>
Oct  8 19:10:52 compute-0 nova_compute[117514]:  <metadata>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 19:10:52 compute-0 nova_compute[117514]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:      <nova:name>tempest-TestNetworkBasicOps-server-1744393953</nova:name>
Oct  8 19:10:52 compute-0 nova_compute[117514]:      <nova:creationTime>2025-10-08 19:10:52</nova:creationTime>
Oct  8 19:10:52 compute-0 nova_compute[117514]:      <nova:flavor name="m1.nano">
Oct  8 19:10:52 compute-0 nova_compute[117514]:        <nova:memory>128</nova:memory>
Oct  8 19:10:52 compute-0 nova_compute[117514]:        <nova:disk>1</nova:disk>
Oct  8 19:10:52 compute-0 nova_compute[117514]:        <nova:swap>0</nova:swap>
Oct  8 19:10:52 compute-0 nova_compute[117514]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 19:10:52 compute-0 nova_compute[117514]:        <nova:vcpus>1</nova:vcpus>
Oct  8 19:10:52 compute-0 nova_compute[117514]:      </nova:flavor>
Oct  8 19:10:52 compute-0 nova_compute[117514]:      <nova:owner>
Oct  8 19:10:52 compute-0 nova_compute[117514]:        <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct  8 19:10:52 compute-0 nova_compute[117514]:        <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct  8 19:10:52 compute-0 nova_compute[117514]:      </nova:owner>
Oct  8 19:10:52 compute-0 nova_compute[117514]:      <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:      <nova:ports>
Oct  8 19:10:52 compute-0 nova_compute[117514]:        <nova:port uuid="062b16e8-3c3b-4520-b0f8-536d588db2f5">
Oct  8 19:10:52 compute-0 nova_compute[117514]:          <nova:ip type="fixed" address="10.100.0.24" ipVersion="4"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:        </nova:port>
Oct  8 19:10:52 compute-0 nova_compute[117514]:      </nova:ports>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    </nova:instance>
Oct  8 19:10:52 compute-0 nova_compute[117514]:  </metadata>
Oct  8 19:10:52 compute-0 nova_compute[117514]:  <sysinfo type="smbios">
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <system>
Oct  8 19:10:52 compute-0 nova_compute[117514]:      <entry name="manufacturer">RDO</entry>
Oct  8 19:10:52 compute-0 nova_compute[117514]:      <entry name="product">OpenStack Compute</entry>
Oct  8 19:10:52 compute-0 nova_compute[117514]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 19:10:52 compute-0 nova_compute[117514]:      <entry name="serial">6af51230-93a7-45ef-9a1e-c47302f43bcf</entry>
Oct  8 19:10:52 compute-0 nova_compute[117514]:      <entry name="uuid">6af51230-93a7-45ef-9a1e-c47302f43bcf</entry>
Oct  8 19:10:52 compute-0 nova_compute[117514]:      <entry name="family">Virtual Machine</entry>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    </system>
Oct  8 19:10:52 compute-0 nova_compute[117514]:  </sysinfo>
Oct  8 19:10:52 compute-0 nova_compute[117514]:  <os>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <boot dev="hd"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <smbios mode="sysinfo"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:  </os>
Oct  8 19:10:52 compute-0 nova_compute[117514]:  <features>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <acpi/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <apic/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <vmcoreinfo/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:  </features>
Oct  8 19:10:52 compute-0 nova_compute[117514]:  <clock offset="utc">
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <timer name="hpet" present="no"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:  </clock>
Oct  8 19:10:52 compute-0 nova_compute[117514]:  <cpu mode="host-model" match="exact">
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:  </cpu>
Oct  8 19:10:52 compute-0 nova_compute[117514]:  <devices>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <disk type="file" device="disk">
Oct  8 19:10:52 compute-0 nova_compute[117514]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:      <source file="/var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/disk"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:      <target dev="vda" bus="virtio"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <disk type="file" device="cdrom">
Oct  8 19:10:52 compute-0 nova_compute[117514]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:      <source file="/var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/disk.config"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:      <target dev="sda" bus="sata"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <interface type="ethernet">
Oct  8 19:10:52 compute-0 nova_compute[117514]:      <mac address="fa:16:3e:49:be:8a"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:      <model type="virtio"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:      <mtu size="1442"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:      <target dev="tap062b16e8-3c"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    </interface>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <serial type="pty">
Oct  8 19:10:52 compute-0 nova_compute[117514]:      <log file="/var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/console.log" append="off"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    </serial>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <video>
Oct  8 19:10:52 compute-0 nova_compute[117514]:      <model type="virtio"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    </video>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <input type="tablet" bus="usb"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <rng model="virtio">
Oct  8 19:10:52 compute-0 nova_compute[117514]:      <backend model="random">/dev/urandom</backend>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    </rng>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <controller type="usb" index="0"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    <memballoon model="virtio">
Oct  8 19:10:52 compute-0 nova_compute[117514]:      <stats period="10"/>
Oct  8 19:10:52 compute-0 nova_compute[117514]:    </memballoon>
Oct  8 19:10:52 compute-0 nova_compute[117514]:  </devices>
Oct  8 19:10:52 compute-0 nova_compute[117514]: </domain>
Oct  8 19:10:52 compute-0 nova_compute[117514]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.244 2 DEBUG nova.compute.manager [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Preparing to wait for external event network-vif-plugged-062b16e8-3c3b-4520-b0f8-536d588db2f5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.244 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "6af51230-93a7-45ef-9a1e-c47302f43bcf-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.245 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "6af51230-93a7-45ef-9a1e-c47302f43bcf-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.246 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "6af51230-93a7-45ef-9a1e-c47302f43bcf-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.247 2 DEBUG nova.virt.libvirt.vif [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:10:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1744393953',display_name='tempest-TestNetworkBasicOps-server-1744393953',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1744393953',id=7,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ/nGXxhsdJmWHabE3HFa5+3pmT1eGAFwd96u9XHC+whrqyLo5hIQAYJiUfXapQHjQsYnRIxe45Y0OXwPlQza5nnuSeUdl81Vlbahpy7snJ2RnOlPvASQfobelq2pqhHKA==',key_name='tempest-TestNetworkBasicOps-575443871',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-ke980fn8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:10:45Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=6af51230-93a7-45ef-9a1e-c47302f43bcf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "address": "fa:16:3e:49:be:8a", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap062b16e8-3c", "ovs_interfaceid": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.248 2 DEBUG nova.network.os_vif_util [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "address": "fa:16:3e:49:be:8a", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap062b16e8-3c", "ovs_interfaceid": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.249 2 DEBUG nova.network.os_vif_util [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:49:be:8a,bridge_name='br-int',has_traffic_filtering=True,id=062b16e8-3c3b-4520-b0f8-536d588db2f5,network=Network(e3bc67bd-dd21-4701-b445-33eb52179602),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap062b16e8-3c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.250 2 DEBUG os_vif [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:be:8a,bridge_name='br-int',has_traffic_filtering=True,id=062b16e8-3c3b-4520-b0f8-536d588db2f5,network=Network(e3bc67bd-dd21-4701-b445-33eb52179602),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap062b16e8-3c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.252 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.253 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.258 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap062b16e8-3c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.259 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap062b16e8-3c, col_values=(('external_ids', {'iface-id': '062b16e8-3c3b-4520-b0f8-536d588db2f5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:49:be:8a', 'vm-uuid': '6af51230-93a7-45ef-9a1e-c47302f43bcf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:10:52 compute-0 NetworkManager[1035]: <info>  [1759950652.2632] manager: (tap062b16e8-3c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.273 2 INFO os_vif [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:be:8a,bridge_name='br-int',has_traffic_filtering=True,id=062b16e8-3c3b-4520-b0f8-536d588db2f5,network=Network(e3bc67bd-dd21-4701-b445-33eb52179602),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap062b16e8-3c')#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.322 2 DEBUG nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.323 2 DEBUG nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.324 2 DEBUG nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No VIF found with MAC fa:16:3e:49:be:8a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 19:10:52 compute-0 nova_compute[117514]: 2025-10-08 19:10:52.325 2 INFO nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Using config drive#033[00m
Oct  8 19:10:52 compute-0 podman[147858]: 2025-10-08 19:10:52.64848277 +0000 UTC m=+0.074502207 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Oct  8 19:10:53 compute-0 nova_compute[117514]: 2025-10-08 19:10:53.254 2 INFO nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Creating config drive at /var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/disk.config#033[00m
Oct  8 19:10:53 compute-0 nova_compute[117514]: 2025-10-08 19:10:53.259 2 DEBUG oslo_concurrency.processutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv_e8qxmf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:10:53 compute-0 nova_compute[117514]: 2025-10-08 19:10:53.395 2 DEBUG oslo_concurrency.processutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv_e8qxmf" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:10:53 compute-0 NetworkManager[1035]: <info>  [1759950653.4797] manager: (tap062b16e8-3c): new Tun device (/org/freedesktop/NetworkManager/Devices/62)
Oct  8 19:10:53 compute-0 kernel: tap062b16e8-3c: entered promiscuous mode
Oct  8 19:10:53 compute-0 ovn_controller[19759]: 2025-10-08T19:10:53Z|00094|binding|INFO|Claiming lport 062b16e8-3c3b-4520-b0f8-536d588db2f5 for this chassis.
Oct  8 19:10:53 compute-0 ovn_controller[19759]: 2025-10-08T19:10:53Z|00095|binding|INFO|062b16e8-3c3b-4520-b0f8-536d588db2f5: Claiming fa:16:3e:49:be:8a 10.100.0.24
Oct  8 19:10:53 compute-0 nova_compute[117514]: 2025-10-08 19:10:53.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:53 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:53.497 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:49:be:8a 10.100.0.24'], port_security=['fa:16:3e:49:be:8a 10.100.0.24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.24/28', 'neutron:device_id': '6af51230-93a7-45ef-9a1e-c47302f43bcf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e3bc67bd-dd21-4701-b445-33eb52179602', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a7ebd8cf-2e32-494a-bac7-d2c7c2ffc36a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=325bd26c-56bb-4683-8b62-92cc8f266207, chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=062b16e8-3c3b-4520-b0f8-536d588db2f5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:10:53 compute-0 ovn_controller[19759]: 2025-10-08T19:10:53Z|00096|binding|INFO|Setting lport 062b16e8-3c3b-4520-b0f8-536d588db2f5 up in Southbound
Oct  8 19:10:53 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:53.500 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 062b16e8-3c3b-4520-b0f8-536d588db2f5 in datapath e3bc67bd-dd21-4701-b445-33eb52179602 bound to our chassis#033[00m
Oct  8 19:10:53 compute-0 ovn_controller[19759]: 2025-10-08T19:10:53Z|00097|binding|INFO|Setting lport 062b16e8-3c3b-4520-b0f8-536d588db2f5 ovn-installed in OVS
Oct  8 19:10:53 compute-0 nova_compute[117514]: 2025-10-08 19:10:53.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:53 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:53.504 28643 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e3bc67bd-dd21-4701-b445-33eb52179602#033[00m
Oct  8 19:10:53 compute-0 nova_compute[117514]: 2025-10-08 19:10:53.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:53 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:53.526 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[9707f6b2-8c19-4bb4-941c-815b93edbb1b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:10:53 compute-0 systemd-udevd[147898]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 19:10:53 compute-0 systemd-machined[77568]: New machine qemu-7-instance-00000007.
Oct  8 19:10:53 compute-0 NetworkManager[1035]: <info>  [1759950653.5542] device (tap062b16e8-3c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 19:10:53 compute-0 NetworkManager[1035]: <info>  [1759950653.5553] device (tap062b16e8-3c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 19:10:53 compute-0 systemd[1]: Started Virtual Machine qemu-7-instance-00000007.
Oct  8 19:10:53 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:53.561 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[2bc8f78e-60a9-408b-af3f-750664ad33f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:10:53 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:53.565 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[8f715c55-3505-4576-b1ed-4ac8256d0186]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:10:53 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:53.602 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[6b1d07ed-a766-44b8-8f52-63fd41fc9346]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:10:53 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:53.631 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[e8c5d870-8fc2-4993-ad2a-62c4292d398b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape3bc67bd-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:4f:d1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 130101, 'reachable_time': 29029, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 147907, 'error': None, 'target': 'ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:10:53 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:53.657 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[cec0ecea-ed82-4475-824f-28a08b1b52c6]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tape3bc67bd-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 130118, 'tstamp': 130118}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 147910, 'error': None, 'target': 'ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.17'], ['IFA_LOCAL', '10.100.0.17'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tape3bc67bd-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 130123, 'tstamp': 130123}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 147910, 'error': None, 'target': 'ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:10:53 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:53.659 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape3bc67bd-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:10:53 compute-0 nova_compute[117514]: 2025-10-08 19:10:53.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:53 compute-0 nova_compute[117514]: 2025-10-08 19:10:53.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:53 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:53.663 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape3bc67bd-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:10:53 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:53.663 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:10:53 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:53.664 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape3bc67bd-d0, col_values=(('external_ids', {'iface-id': 'd935682a-e42a-4970-b54c-b54c616cf798'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:10:53 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:10:53.664 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.301 2 DEBUG nova.compute.manager [req-57d1e0ce-bece-4a4c-a337-ce24b22dac81 req-b4b95a35-2872-4e05-b4ec-5278ea11752c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Received event network-vif-plugged-062b16e8-3c3b-4520-b0f8-536d588db2f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.302 2 DEBUG oslo_concurrency.lockutils [req-57d1e0ce-bece-4a4c-a337-ce24b22dac81 req-b4b95a35-2872-4e05-b4ec-5278ea11752c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "6af51230-93a7-45ef-9a1e-c47302f43bcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.302 2 DEBUG oslo_concurrency.lockutils [req-57d1e0ce-bece-4a4c-a337-ce24b22dac81 req-b4b95a35-2872-4e05-b4ec-5278ea11752c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "6af51230-93a7-45ef-9a1e-c47302f43bcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.303 2 DEBUG oslo_concurrency.lockutils [req-57d1e0ce-bece-4a4c-a337-ce24b22dac81 req-b4b95a35-2872-4e05-b4ec-5278ea11752c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "6af51230-93a7-45ef-9a1e-c47302f43bcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.303 2 DEBUG nova.compute.manager [req-57d1e0ce-bece-4a4c-a337-ce24b22dac81 req-b4b95a35-2872-4e05-b4ec-5278ea11752c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Processing event network-vif-plugged-062b16e8-3c3b-4520-b0f8-536d588db2f5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.372 2 DEBUG nova.network.neutron [req-64a2a93f-5a7b-44ee-9876-16e5bef090f3 req-ebe6251b-bdb6-4253-adb9-fd5abb039c97 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Updated VIF entry in instance network info cache for port 062b16e8-3c3b-4520-b0f8-536d588db2f5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.373 2 DEBUG nova.network.neutron [req-64a2a93f-5a7b-44ee-9876-16e5bef090f3 req-ebe6251b-bdb6-4253-adb9-fd5abb039c97 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Updating instance_info_cache with network_info: [{"id": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "address": "fa:16:3e:49:be:8a", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap062b16e8-3c", "ovs_interfaceid": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.391 2 DEBUG oslo_concurrency.lockutils [req-64a2a93f-5a7b-44ee-9876-16e5bef090f3 req-ebe6251b-bdb6-4253-adb9-fd5abb039c97 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-6af51230-93a7-45ef-9a1e-c47302f43bcf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.849 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950654.848302, 6af51230-93a7-45ef-9a1e-c47302f43bcf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.849 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] VM Started (Lifecycle Event)#033[00m
Oct  8 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.853 2 DEBUG nova.compute.manager [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.856 2 DEBUG nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.860 2 INFO nova.virt.libvirt.driver [-] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Instance spawned successfully.#033[00m
Oct  8 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.861 2 DEBUG nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.882 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.893 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.901 2 DEBUG nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.902 2 DEBUG nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.903 2 DEBUG nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.904 2 DEBUG nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.905 2 DEBUG nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.906 2 DEBUG nova.virt.libvirt.driver [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.942 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.943 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950654.8484669, 6af51230-93a7-45ef-9a1e-c47302f43bcf => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.943 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] VM Paused (Lifecycle Event)#033[00m
Oct  8 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.975 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.980 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950654.8558252, 6af51230-93a7-45ef-9a1e-c47302f43bcf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.981 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] VM Resumed (Lifecycle Event)#033[00m
Oct  8 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.989 2 INFO nova.compute.manager [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Took 9.86 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 19:10:54 compute-0 nova_compute[117514]: 2025-10-08 19:10:54.989 2 DEBUG nova.compute.manager [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:10:55 compute-0 nova_compute[117514]: 2025-10-08 19:10:55.002 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:10:55 compute-0 nova_compute[117514]: 2025-10-08 19:10:55.007 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 19:10:55 compute-0 nova_compute[117514]: 2025-10-08 19:10:55.043 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 19:10:55 compute-0 nova_compute[117514]: 2025-10-08 19:10:55.073 2 INFO nova.compute.manager [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Took 10.38 seconds to build instance.#033[00m
Oct  8 19:10:55 compute-0 nova_compute[117514]: 2025-10-08 19:10:55.091 2 DEBUG oslo_concurrency.lockutils [None req-f9fc2eb8-2afe-4e77-b811-0c47e1f13313 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "6af51230-93a7-45ef-9a1e-c47302f43bcf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.466s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:10:55 compute-0 nova_compute[117514]: 2025-10-08 19:10:55.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:56 compute-0 nova_compute[117514]: 2025-10-08 19:10:56.389 2 DEBUG nova.compute.manager [req-a63cbb7d-410e-48eb-8159-f18f9d9a1947 req-82204da9-6e7f-4ddd-8e5a-203d270a81a1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Received event network-vif-plugged-062b16e8-3c3b-4520-b0f8-536d588db2f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:10:56 compute-0 nova_compute[117514]: 2025-10-08 19:10:56.390 2 DEBUG oslo_concurrency.lockutils [req-a63cbb7d-410e-48eb-8159-f18f9d9a1947 req-82204da9-6e7f-4ddd-8e5a-203d270a81a1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "6af51230-93a7-45ef-9a1e-c47302f43bcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:10:56 compute-0 nova_compute[117514]: 2025-10-08 19:10:56.391 2 DEBUG oslo_concurrency.lockutils [req-a63cbb7d-410e-48eb-8159-f18f9d9a1947 req-82204da9-6e7f-4ddd-8e5a-203d270a81a1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "6af51230-93a7-45ef-9a1e-c47302f43bcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:10:56 compute-0 nova_compute[117514]: 2025-10-08 19:10:56.392 2 DEBUG oslo_concurrency.lockutils [req-a63cbb7d-410e-48eb-8159-f18f9d9a1947 req-82204da9-6e7f-4ddd-8e5a-203d270a81a1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "6af51230-93a7-45ef-9a1e-c47302f43bcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:10:56 compute-0 nova_compute[117514]: 2025-10-08 19:10:56.392 2 DEBUG nova.compute.manager [req-a63cbb7d-410e-48eb-8159-f18f9d9a1947 req-82204da9-6e7f-4ddd-8e5a-203d270a81a1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] No waiting events found dispatching network-vif-plugged-062b16e8-3c3b-4520-b0f8-536d588db2f5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:10:56 compute-0 nova_compute[117514]: 2025-10-08 19:10:56.393 2 WARNING nova.compute.manager [req-a63cbb7d-410e-48eb-8159-f18f9d9a1947 req-82204da9-6e7f-4ddd-8e5a-203d270a81a1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Received unexpected event network-vif-plugged-062b16e8-3c3b-4520-b0f8-536d588db2f5 for instance with vm_state active and task_state None.#033[00m
Oct  8 19:10:57 compute-0 nova_compute[117514]: 2025-10-08 19:10:57.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:10:58 compute-0 podman[147920]: 2025-10-08 19:10:58.640654616 +0000 UTC m=+0.057319670 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  8 19:10:58 compute-0 podman[147919]: 2025-10-08 19:10:58.667971687 +0000 UTC m=+0.080591444 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, managed_by=edpm_ansible, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, release=1755695350, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct  8 19:11:00 compute-0 nova_compute[117514]: 2025-10-08 19:11:00.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:11:01 compute-0 podman[147959]: 2025-10-08 19:11:01.663488804 +0000 UTC m=+0.078641677 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 19:11:02 compute-0 nova_compute[117514]: 2025-10-08 19:11:02.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:11:03 compute-0 nova_compute[117514]: 2025-10-08 19:11:03.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:11:04 compute-0 podman[147993]: 2025-10-08 19:11:04.68595602 +0000 UTC m=+0.091934061 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  8 19:11:04 compute-0 podman[147996]: 2025-10-08 19:11:04.705425074 +0000 UTC m=+0.100214961 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  8 19:11:04 compute-0 podman[147995]: 2025-10-08 19:11:04.716515905 +0000 UTC m=+0.123270869 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Oct  8 19:11:05 compute-0 ovn_controller[19759]: 2025-10-08T19:11:05Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:49:be:8a 10.100.0.24
Oct  8 19:11:05 compute-0 ovn_controller[19759]: 2025-10-08T19:11:05Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:49:be:8a 10.100.0.24
Oct  8 19:11:05 compute-0 nova_compute[117514]: 2025-10-08 19:11:05.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:11:06 compute-0 nova_compute[117514]: 2025-10-08 19:11:06.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:11:06 compute-0 nova_compute[117514]: 2025-10-08 19:11:06.719 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  8 19:11:07 compute-0 nova_compute[117514]: 2025-10-08 19:11:07.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:11:07 compute-0 nova_compute[117514]: 2025-10-08 19:11:07.732 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:11:07 compute-0 nova_compute[117514]: 2025-10-08 19:11:07.732 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:11:07 compute-0 nova_compute[117514]: 2025-10-08 19:11:07.733 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:11:07 compute-0 nova_compute[117514]: 2025-10-08 19:11:07.759 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:11:07 compute-0 nova_compute[117514]: 2025-10-08 19:11:07.759 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:11:07 compute-0 nova_compute[117514]: 2025-10-08 19:11:07.760 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:11:07 compute-0 nova_compute[117514]: 2025-10-08 19:11:07.760 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 19:11:07 compute-0 nova_compute[117514]: 2025-10-08 19:11:07.845 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:11:07 compute-0 nova_compute[117514]: 2025-10-08 19:11:07.935 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:11:07 compute-0 nova_compute[117514]: 2025-10-08 19:11:07.936 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.029 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.039 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.132 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.134 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.207 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.469 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.470 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5758MB free_disk=73.35517883300781GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.471 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.471 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.684 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Instance 783f8889-2bc8-4641-bdb9-95ee4226a2fd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.685 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Instance 6af51230-93a7-45ef-9a1e-c47302f43bcf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.686 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.686 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.737 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Refreshing inventories for resource provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  8 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.808 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Updating ProviderTree inventory for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  8 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.809 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Updating inventory in ProviderTree for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.824 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Refreshing aggregate associations for resource provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  8 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.848 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Refreshing trait associations for resource provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_FDC,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AVX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_FMA3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SVM,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX2,HW_CPU_X86_SSE,HW_CPU_X86_CLMUL,HW_CPU_X86_F16C,HW_CPU_X86_AMD_SVM,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_BMI,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_PCNET _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  8 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.908 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.924 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.956 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 19:11:08 compute-0 nova_compute[117514]: 2025-10-08 19:11:08.956 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.485s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:11:09 compute-0 nova_compute[117514]: 2025-10-08 19:11:09.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:11:09 compute-0 nova_compute[117514]: 2025-10-08 19:11:09.718 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:11:09 compute-0 nova_compute[117514]: 2025-10-08 19:11:09.741 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:11:09 compute-0 nova_compute[117514]: 2025-10-08 19:11:09.741 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 19:11:09 compute-0 nova_compute[117514]: 2025-10-08 19:11:09.742 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 19:11:10 compute-0 nova_compute[117514]: 2025-10-08 19:11:10.111 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:11:10 compute-0 nova_compute[117514]: 2025-10-08 19:11:10.112 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquired lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:11:10 compute-0 nova_compute[117514]: 2025-10-08 19:11:10.113 2 DEBUG nova.network.neutron [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 19:11:10 compute-0 nova_compute[117514]: 2025-10-08 19:11:10.113 2 DEBUG nova.objects.instance [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 783f8889-2bc8-4641-bdb9-95ee4226a2fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:11:10 compute-0 nova_compute[117514]: 2025-10-08 19:11:10.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:11:11 compute-0 nova_compute[117514]: 2025-10-08 19:11:11.643 2 DEBUG nova.network.neutron [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Updating instance_info_cache with network_info: [{"id": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "address": "fa:16:3e:4e:85:2e", "network": {"id": "0d073e98-c9f2-4b90-8237-84ff2fa99090", "bridge": "br-int", "label": "tempest-network-smoke--1785011615", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb32e9e-52", "ovs_interfaceid": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "address": "fa:16:3e:11:ac:ba", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, 
"type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea81e5cb-74", "ovs_interfaceid": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:11:11 compute-0 nova_compute[117514]: 2025-10-08 19:11:11.666 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Releasing lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:11:11 compute-0 nova_compute[117514]: 2025-10-08 19:11:11.666 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 19:11:11 compute-0 nova_compute[117514]: 2025-10-08 19:11:11.667 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:11:11 compute-0 nova_compute[117514]: 2025-10-08 19:11:11.668 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:11:11 compute-0 nova_compute[117514]: 2025-10-08 19:11:11.669 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:11:11 compute-0 nova_compute[117514]: 2025-10-08 19:11:11.669 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 19:11:11 compute-0 nova_compute[117514]: 2025-10-08 19:11:11.670 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:11:12 compute-0 nova_compute[117514]: 2025-10-08 19:11:12.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:11:12 compute-0 nova_compute[117514]: 2025-10-08 19:11:12.728 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:11:12 compute-0 nova_compute[117514]: 2025-10-08 19:11:12.729 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  8 19:11:12 compute-0 nova_compute[117514]: 2025-10-08 19:11:12.750 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  8 19:11:13 compute-0 nova_compute[117514]: 2025-10-08 19:11:13.515 2 DEBUG nova.compute.manager [req-170bf41f-a65f-4b98-b4db-2f2e975c59bf req-6cc8365b-dc88-4f15-b73c-4cd5c87feb71 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received event network-changed-ea81e5cb-74ba-43da-a780-3f1f699fa0d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:11:13 compute-0 nova_compute[117514]: 2025-10-08 19:11:13.516 2 DEBUG nova.compute.manager [req-170bf41f-a65f-4b98-b4db-2f2e975c59bf req-6cc8365b-dc88-4f15-b73c-4cd5c87feb71 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Refreshing instance network info cache due to event network-changed-ea81e5cb-74ba-43da-a780-3f1f699fa0d6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 19:11:13 compute-0 nova_compute[117514]: 2025-10-08 19:11:13.516 2 DEBUG oslo_concurrency.lockutils [req-170bf41f-a65f-4b98-b4db-2f2e975c59bf req-6cc8365b-dc88-4f15-b73c-4cd5c87feb71 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:11:13 compute-0 nova_compute[117514]: 2025-10-08 19:11:13.517 2 DEBUG oslo_concurrency.lockutils [req-170bf41f-a65f-4b98-b4db-2f2e975c59bf req-6cc8365b-dc88-4f15-b73c-4cd5c87feb71 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:11:13 compute-0 nova_compute[117514]: 2025-10-08 19:11:13.517 2 DEBUG nova.network.neutron [req-170bf41f-a65f-4b98-b4db-2f2e975c59bf req-6cc8365b-dc88-4f15-b73c-4cd5c87feb71 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Refreshing network info cache for port ea81e5cb-74ba-43da-a780-3f1f699fa0d6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 19:11:13 compute-0 podman[148072]: 2025-10-08 19:11:13.666324393 +0000 UTC m=+0.076061272 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 19:11:14 compute-0 nova_compute[117514]: 2025-10-08 19:11:14.362 2 DEBUG nova.network.neutron [req-170bf41f-a65f-4b98-b4db-2f2e975c59bf req-6cc8365b-dc88-4f15-b73c-4cd5c87feb71 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Updated VIF entry in instance network info cache for port ea81e5cb-74ba-43da-a780-3f1f699fa0d6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 19:11:14 compute-0 nova_compute[117514]: 2025-10-08 19:11:14.363 2 DEBUG nova.network.neutron [req-170bf41f-a65f-4b98-b4db-2f2e975c59bf req-6cc8365b-dc88-4f15-b73c-4cd5c87feb71 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Updating instance_info_cache with network_info: [{"id": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "address": "fa:16:3e:4e:85:2e", "network": {"id": "0d073e98-c9f2-4b90-8237-84ff2fa99090", "bridge": "br-int", "label": "tempest-network-smoke--1785011615", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb32e9e-52", "ovs_interfaceid": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "address": "fa:16:3e:11:ac:ba", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, 
"tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea81e5cb-74", "ovs_interfaceid": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:11:14 compute-0 nova_compute[117514]: 2025-10-08 19:11:14.379 2 DEBUG oslo_concurrency.lockutils [req-170bf41f-a65f-4b98-b4db-2f2e975c59bf req-6cc8365b-dc88-4f15-b73c-4cd5c87feb71 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:11:15 compute-0 nova_compute[117514]: 2025-10-08 19:11:15.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:11:17 compute-0 nova_compute[117514]: 2025-10-08 19:11:17.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:11:18 compute-0 nova_compute[117514]: 2025-10-08 19:11:18.129 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:11:18 compute-0 nova_compute[117514]: 2025-10-08 19:11:18.150 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Triggering sync for uuid 783f8889-2bc8-4641-bdb9-95ee4226a2fd _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Oct  8 19:11:18 compute-0 nova_compute[117514]: 2025-10-08 19:11:18.151 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Triggering sync for uuid 6af51230-93a7-45ef-9a1e-c47302f43bcf _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Oct  8 19:11:18 compute-0 nova_compute[117514]: 2025-10-08 19:11:18.151 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:11:18 compute-0 nova_compute[117514]: 2025-10-08 19:11:18.152 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:11:18 compute-0 nova_compute[117514]: 2025-10-08 19:11:18.152 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "6af51230-93a7-45ef-9a1e-c47302f43bcf" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:11:18 compute-0 nova_compute[117514]: 2025-10-08 19:11:18.153 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "6af51230-93a7-45ef-9a1e-c47302f43bcf" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:11:18 compute-0 nova_compute[117514]: 2025-10-08 19:11:18.229 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.078s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:11:18 compute-0 nova_compute[117514]: 2025-10-08 19:11:18.231 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "6af51230-93a7-45ef-9a1e-c47302f43bcf" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.078s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:11:21 compute-0 nova_compute[117514]: 2025-10-08 19:11:21.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:11:22 compute-0 nova_compute[117514]: 2025-10-08 19:11:22.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:11:23 compute-0 podman[148096]: 2025-10-08 19:11:23.650455211 +0000 UTC m=+0.074246939 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, config_id=edpm, tcib_managed=true)
Oct  8 19:11:26 compute-0 nova_compute[117514]: 2025-10-08 19:11:26.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:11:27 compute-0 nova_compute[117514]: 2025-10-08 19:11:27.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:11:29 compute-0 podman[148118]: 2025-10-08 19:11:29.637572011 +0000 UTC m=+0.061353377 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Oct  8 19:11:29 compute-0 podman[148117]: 2025-10-08 19:11:29.652325578 +0000 UTC m=+0.069917215 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., release=1755695350, version=9.6, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, build-date=2025-08-20T13:12:41, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, container_name=openstack_network_exporter)
Oct  8 19:11:31 compute-0 nova_compute[117514]: 2025-10-08 19:11:31.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:11:32 compute-0 nova_compute[117514]: 2025-10-08 19:11:32.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:11:32 compute-0 podman[148158]: 2025-10-08 19:11:32.651551792 +0000 UTC m=+0.076879844 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 19:11:35 compute-0 podman[148184]: 2025-10-08 19:11:35.649262663 +0000 UTC m=+0.069383789 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  8 19:11:35 compute-0 podman[148182]: 2025-10-08 19:11:35.679793957 +0000 UTC m=+0.098211943 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, tcib_managed=true)
Oct  8 19:11:35 compute-0 podman[148183]: 2025-10-08 19:11:35.709845866 +0000 UTC m=+0.125472532 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  8 19:11:36 compute-0 nova_compute[117514]: 2025-10-08 19:11:36.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:11:37 compute-0 nova_compute[117514]: 2025-10-08 19:11:37.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:11:41 compute-0 nova_compute[117514]: 2025-10-08 19:11:41.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:11:42 compute-0 nova_compute[117514]: 2025-10-08 19:11:42.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:11:43 compute-0 ovn_controller[19759]: 2025-10-08T19:11:43Z|00098|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Oct  8 19:11:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:44.232 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:11:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:44.233 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:11:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:44.235 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:11:44 compute-0 podman[148242]: 2025-10-08 19:11:44.668697295 +0000 UTC m=+0.083457646 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 19:11:46 compute-0 nova_compute[117514]: 2025-10-08 19:11:46.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:11:47 compute-0 nova_compute[117514]: 2025-10-08 19:11:47.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:11:51 compute-0 nova_compute[117514]: 2025-10-08 19:11:51.126 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:11:52 compute-0 nova_compute[117514]: 2025-10-08 19:11:52.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:11:54 compute-0 podman[148266]: 2025-10-08 19:11:54.666222163 +0000 UTC m=+0.090221532 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 19:11:56 compute-0 nova_compute[117514]: 2025-10-08 19:11:56.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:11:57 compute-0 nova_compute[117514]: 2025-10-08 19:11:57.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:11:57 compute-0 nova_compute[117514]: 2025-10-08 19:11:57.976 2 DEBUG oslo_concurrency.lockutils [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "6af51230-93a7-45ef-9a1e-c47302f43bcf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:11:57 compute-0 nova_compute[117514]: 2025-10-08 19:11:57.977 2 DEBUG oslo_concurrency.lockutils [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "6af51230-93a7-45ef-9a1e-c47302f43bcf" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:11:57 compute-0 nova_compute[117514]: 2025-10-08 19:11:57.977 2 DEBUG oslo_concurrency.lockutils [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "6af51230-93a7-45ef-9a1e-c47302f43bcf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:11:57 compute-0 nova_compute[117514]: 2025-10-08 19:11:57.978 2 DEBUG oslo_concurrency.lockutils [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "6af51230-93a7-45ef-9a1e-c47302f43bcf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:11:57 compute-0 nova_compute[117514]: 2025-10-08 19:11:57.978 2 DEBUG oslo_concurrency.lockutils [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "6af51230-93a7-45ef-9a1e-c47302f43bcf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:11:57 compute-0 nova_compute[117514]: 2025-10-08 19:11:57.981 2 INFO nova.compute.manager [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Terminating instance#033[00m
Oct  8 19:11:57 compute-0 nova_compute[117514]: 2025-10-08 19:11:57.983 2 DEBUG nova.compute.manager [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 19:11:58 compute-0 kernel: tap062b16e8-3c (unregistering): left promiscuous mode
Oct  8 19:11:58 compute-0 NetworkManager[1035]: <info>  [1759950718.0188] device (tap062b16e8-3c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:11:58 compute-0 ovn_controller[19759]: 2025-10-08T19:11:58Z|00099|binding|INFO|Releasing lport 062b16e8-3c3b-4520-b0f8-536d588db2f5 from this chassis (sb_readonly=0)
Oct  8 19:11:58 compute-0 ovn_controller[19759]: 2025-10-08T19:11:58Z|00100|binding|INFO|Setting lport 062b16e8-3c3b-4520-b0f8-536d588db2f5 down in Southbound
Oct  8 19:11:58 compute-0 ovn_controller[19759]: 2025-10-08T19:11:58Z|00101|binding|INFO|Removing iface tap062b16e8-3c ovn-installed in OVS
Oct  8 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:11:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:58.045 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:49:be:8a 10.100.0.24'], port_security=['fa:16:3e:49:be:8a 10.100.0.24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.24/28', 'neutron:device_id': '6af51230-93a7-45ef-9a1e-c47302f43bcf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e3bc67bd-dd21-4701-b445-33eb52179602', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a7ebd8cf-2e32-494a-bac7-d2c7c2ffc36a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=325bd26c-56bb-4683-8b62-92cc8f266207, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=062b16e8-3c3b-4520-b0f8-536d588db2f5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:11:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:58.047 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 062b16e8-3c3b-4520-b0f8-536d588db2f5 in datapath e3bc67bd-dd21-4701-b445-33eb52179602 unbound from our chassis#033[00m
Oct  8 19:11:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:58.049 28643 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e3bc67bd-dd21-4701-b445-33eb52179602#033[00m
Oct  8 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:11:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:58.071 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[bc3585de-8852-4a28-9a58-bb432951b116]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:11:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:58.099 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[b0ffdab6-c181-481f-8975-4d4809daa520]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:11:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:58.101 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[78b97c2d-0ea7-44e6-b780-741c8661471d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:11:58 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000007.scope: Deactivated successfully.
Oct  8 19:11:58 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000007.scope: Consumed 14.599s CPU time.
Oct  8 19:11:58 compute-0 systemd-machined[77568]: Machine qemu-7-instance-00000007 terminated.
Oct  8 19:11:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:58.125 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[5287ee8a-0659-4662-a2fb-94117d2cbb60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:11:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:58.141 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[d3251fe8-f580-46b7-98d5-183049219694]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape3bc67bd-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:4f:d1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 15, 'tx_packets': 7, 'rx_bytes': 1222, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 15, 'tx_packets': 7, 'rx_bytes': 1222, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 36], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 130101, 'reachable_time': 29029, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 10, 'inoctets': 872, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 10, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 872, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 10, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 148299, 'error': None, 'target': 'ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:11:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:58.155 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[5fb39fda-bbb7-43f4-bf7a-3c8ccb532151]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tape3bc67bd-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 130118, 'tstamp': 130118}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 148300, 'error': None, 'target': 'ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.17'], ['IFA_LOCAL', '10.100.0.17'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tape3bc67bd-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 130123, 'tstamp': 130123}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 148300, 'error': None, 'target': 'ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:11:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:58.156 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape3bc67bd-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.158 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:11:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:58.162 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape3bc67bd-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:11:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:58.162 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:11:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:58.163 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape3bc67bd-d0, col_values=(('external_ids', {'iface-id': 'd935682a-e42a-4970-b54c-b54c616cf798'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:11:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:58.163 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.239 2 INFO nova.virt.libvirt.driver [-] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Instance destroyed successfully.#033[00m
Oct  8 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.239 2 DEBUG nova.objects.instance [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'resources' on Instance uuid 6af51230-93a7-45ef-9a1e-c47302f43bcf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.253 2 DEBUG nova.virt.libvirt.vif [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:10:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1744393953',display_name='tempest-TestNetworkBasicOps-server-1744393953',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1744393953',id=7,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ/nGXxhsdJmWHabE3HFa5+3pmT1eGAFwd96u9XHC+whrqyLo5hIQAYJiUfXapQHjQsYnRIxe45Y0OXwPlQza5nnuSeUdl81Vlbahpy7snJ2RnOlPvASQfobelq2pqhHKA==',key_name='tempest-TestNetworkBasicOps-575443871',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:10:54Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-ke980fn8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:10:55Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=6af51230-93a7-45ef-9a1e-c47302f43bcf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "address": "fa:16:3e:49:be:8a", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap062b16e8-3c", "ovs_interfaceid": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.253 2 DEBUG nova.network.os_vif_util [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "address": "fa:16:3e:49:be:8a", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap062b16e8-3c", "ovs_interfaceid": "062b16e8-3c3b-4520-b0f8-536d588db2f5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.254 2 DEBUG nova.network.os_vif_util [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:49:be:8a,bridge_name='br-int',has_traffic_filtering=True,id=062b16e8-3c3b-4520-b0f8-536d588db2f5,network=Network(e3bc67bd-dd21-4701-b445-33eb52179602),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap062b16e8-3c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.254 2 DEBUG os_vif [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:be:8a,bridge_name='br-int',has_traffic_filtering=True,id=062b16e8-3c3b-4520-b0f8-536d588db2f5,network=Network(e3bc67bd-dd21-4701-b445-33eb52179602),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap062b16e8-3c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.256 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap062b16e8-3c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.261 2 INFO os_vif [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:be:8a,bridge_name='br-int',has_traffic_filtering=True,id=062b16e8-3c3b-4520-b0f8-536d588db2f5,network=Network(e3bc67bd-dd21-4701-b445-33eb52179602),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap062b16e8-3c')#033[00m
Oct  8 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.261 2 INFO nova.virt.libvirt.driver [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Deleting instance files /var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf_del#033[00m
Oct  8 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.262 2 INFO nova.virt.libvirt.driver [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Deletion of /var/lib/nova/instances/6af51230-93a7-45ef-9a1e-c47302f43bcf_del complete#033[00m
Oct  8 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.323 2 INFO nova.compute.manager [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.324 2 DEBUG oslo.service.loopingcall [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.324 2 DEBUG nova.compute.manager [-] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.325 2 DEBUG nova.network.neutron [-] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.354 2 DEBUG nova.compute.manager [req-c88323bd-afff-41b0-8f1e-449dd222b828 req-5519c56f-b968-4ae8-9d5e-83ad54848912 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Received event network-vif-unplugged-062b16e8-3c3b-4520-b0f8-536d588db2f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.354 2 DEBUG oslo_concurrency.lockutils [req-c88323bd-afff-41b0-8f1e-449dd222b828 req-5519c56f-b968-4ae8-9d5e-83ad54848912 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "6af51230-93a7-45ef-9a1e-c47302f43bcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.355 2 DEBUG oslo_concurrency.lockutils [req-c88323bd-afff-41b0-8f1e-449dd222b828 req-5519c56f-b968-4ae8-9d5e-83ad54848912 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "6af51230-93a7-45ef-9a1e-c47302f43bcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.355 2 DEBUG oslo_concurrency.lockutils [req-c88323bd-afff-41b0-8f1e-449dd222b828 req-5519c56f-b968-4ae8-9d5e-83ad54848912 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "6af51230-93a7-45ef-9a1e-c47302f43bcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.355 2 DEBUG nova.compute.manager [req-c88323bd-afff-41b0-8f1e-449dd222b828 req-5519c56f-b968-4ae8-9d5e-83ad54848912 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] No waiting events found dispatching network-vif-unplugged-062b16e8-3c3b-4520-b0f8-536d588db2f5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.356 2 DEBUG nova.compute.manager [req-c88323bd-afff-41b0-8f1e-449dd222b828 req-5519c56f-b968-4ae8-9d5e-83ad54848912 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Received event network-vif-unplugged-062b16e8-3c3b-4520-b0f8-536d588db2f5 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 19:11:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:58.434 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:75:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '5e:14:dd:63:55:2a'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:11:58 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:11:58.435 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.812 2 DEBUG nova.network.neutron [-] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.833 2 INFO nova.compute.manager [-] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Took 0.51 seconds to deallocate network for instance.#033[00m
Oct  8 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.884 2 DEBUG oslo_concurrency.lockutils [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.885 2 DEBUG oslo_concurrency.lockutils [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.889 2 DEBUG nova.compute.manager [req-144479d1-b1bc-429c-835c-fdbabf1c1230 req-24cdfdd0-ce63-4712-8288-cff53fd9846e bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Received event network-vif-deleted-062b16e8-3c3b-4520-b0f8-536d588db2f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.972 2 DEBUG nova.compute.provider_tree [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:11:58 compute-0 nova_compute[117514]: 2025-10-08 19:11:58.989 2 DEBUG nova.scheduler.client.report [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:11:59 compute-0 nova_compute[117514]: 2025-10-08 19:11:59.024 2 DEBUG oslo_concurrency.lockutils [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:11:59 compute-0 nova_compute[117514]: 2025-10-08 19:11:59.061 2 INFO nova.scheduler.client.report [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Deleted allocations for instance 6af51230-93a7-45ef-9a1e-c47302f43bcf#033[00m
Oct  8 19:11:59 compute-0 nova_compute[117514]: 2025-10-08 19:11:59.159 2 DEBUG oslo_concurrency.lockutils [None req-8ba7d0d6-2bc3-4a73-9d4d-cf1898b3aebc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "6af51230-93a7-45ef-9a1e-c47302f43bcf" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.439 2 DEBUG nova.compute.manager [req-4843d87d-c925-4a63-ab00-749832b02af7 req-0e9ea112-3ea9-458a-8679-775ed33b8318 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Received event network-vif-plugged-062b16e8-3c3b-4520-b0f8-536d588db2f5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.440 2 DEBUG oslo_concurrency.lockutils [req-4843d87d-c925-4a63-ab00-749832b02af7 req-0e9ea112-3ea9-458a-8679-775ed33b8318 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "6af51230-93a7-45ef-9a1e-c47302f43bcf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.441 2 DEBUG oslo_concurrency.lockutils [req-4843d87d-c925-4a63-ab00-749832b02af7 req-0e9ea112-3ea9-458a-8679-775ed33b8318 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "6af51230-93a7-45ef-9a1e-c47302f43bcf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.441 2 DEBUG oslo_concurrency.lockutils [req-4843d87d-c925-4a63-ab00-749832b02af7 req-0e9ea112-3ea9-458a-8679-775ed33b8318 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "6af51230-93a7-45ef-9a1e-c47302f43bcf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.441 2 DEBUG nova.compute.manager [req-4843d87d-c925-4a63-ab00-749832b02af7 req-0e9ea112-3ea9-458a-8679-775ed33b8318 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] No waiting events found dispatching network-vif-plugged-062b16e8-3c3b-4520-b0f8-536d588db2f5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.442 2 WARNING nova.compute.manager [req-4843d87d-c925-4a63-ab00-749832b02af7 req-0e9ea112-3ea9-458a-8679-775ed33b8318 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Received unexpected event network-vif-plugged-062b16e8-3c3b-4520-b0f8-536d588db2f5 for instance with vm_state deleted and task_state None.#033[00m
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.445 2 DEBUG oslo_concurrency.lockutils [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "interface-783f8889-2bc8-4641-bdb9-95ee4226a2fd-ea81e5cb-74ba-43da-a780-3f1f699fa0d6" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.445 2 DEBUG oslo_concurrency.lockutils [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "interface-783f8889-2bc8-4641-bdb9-95ee4226a2fd-ea81e5cb-74ba-43da-a780-3f1f699fa0d6" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.472 2 DEBUG nova.objects.instance [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'flavor' on Instance uuid 783f8889-2bc8-4641-bdb9-95ee4226a2fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.494 2 DEBUG nova.virt.libvirt.vif [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:09:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1641480242',display_name='tempest-TestNetworkBasicOps-server-1641480242',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1641480242',id=6,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPSyEE+QeB2DOtd7xoaY+J9mVl+DzPE43UDhso7eEGO9aQXs3wmj/YcqHfJ97lRUVFOa3dbwNiIUyunSI3DyzjQf/v6cjCZ2KkxRD0GJnQ0zRM5omnXaZRnz3Bq5VONa9g==',key_name='tempest-TestNetworkBasicOps-1535027603',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:09:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-pbt1zket',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:09:58Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=783f8889-2bc8-4641-bdb9-95ee4226a2fd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "address": "fa:16:3e:11:ac:ba", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea81e5cb-74", "ovs_interfaceid": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.494 2 DEBUG nova.network.os_vif_util [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "address": "fa:16:3e:11:ac:ba", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea81e5cb-74", "ovs_interfaceid": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.495 2 DEBUG nova.network.os_vif_util [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:11:ac:ba,bridge_name='br-int',has_traffic_filtering=True,id=ea81e5cb-74ba-43da-a780-3f1f699fa0d6,network=Network(e3bc67bd-dd21-4701-b445-33eb52179602),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea81e5cb-74') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.500 2 DEBUG nova.virt.libvirt.guest [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:11:ac:ba"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapea81e5cb-74"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.502 2 DEBUG nova.virt.libvirt.guest [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:11:ac:ba"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapea81e5cb-74"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.504 2 DEBUG nova.virt.libvirt.driver [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Attempting to detach device tapea81e5cb-74 from instance 783f8889-2bc8-4641-bdb9-95ee4226a2fd from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.505 2 DEBUG nova.virt.libvirt.guest [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] detach device xml: <interface type="ethernet">
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <mac address="fa:16:3e:11:ac:ba"/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <model type="virtio"/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <driver name="vhost" rx_queue_size="512"/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <mtu size="1442"/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <target dev="tapea81e5cb-74"/>
Oct  8 19:12:00 compute-0 nova_compute[117514]: </interface>
Oct  8 19:12:00 compute-0 nova_compute[117514]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.514 2 DEBUG nova.virt.libvirt.guest [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:11:ac:ba"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapea81e5cb-74"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.518 2 DEBUG nova.virt.libvirt.guest [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:11:ac:ba"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapea81e5cb-74"/></interface>not found in domain: <domain type='kvm' id='6'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <name>instance-00000006</name>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <uuid>783f8889-2bc8-4641-bdb9-95ee4226a2fd</uuid>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <metadata>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <nova:name>tempest-TestNetworkBasicOps-server-1641480242</nova:name>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <nova:creationTime>2025-10-08 19:10:30</nova:creationTime>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <nova:flavor name="m1.nano">
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <nova:memory>128</nova:memory>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <nova:disk>1</nova:disk>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <nova:swap>0</nova:swap>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <nova:ephemeral>0</nova:ephemeral>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <nova:vcpus>1</nova:vcpus>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  </nova:flavor>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <nova:owner>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  </nova:owner>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <nova:ports>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <nova:port uuid="bfb32e9e-52b6-4043-b9a6-129d11fa2814">
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </nova:port>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <nova:port uuid="ea81e5cb-74ba-43da-a780-3f1f699fa0d6">
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <nova:ip type="fixed" address="10.100.0.22" ipVersion="4"/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </nova:port>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  </nova:ports>
Oct  8 19:12:00 compute-0 nova_compute[117514]: </nova:instance>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  </metadata>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <memory unit='KiB'>131072</memory>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <currentMemory unit='KiB'>131072</currentMemory>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <vcpu placement='static'>1</vcpu>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <resource>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <partition>/machine</partition>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  </resource>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <sysinfo type='smbios'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <system>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <entry name='manufacturer'>RDO</entry>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <entry name='product'>OpenStack Compute</entry>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <entry name='serial'>783f8889-2bc8-4641-bdb9-95ee4226a2fd</entry>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <entry name='uuid'>783f8889-2bc8-4641-bdb9-95ee4226a2fd</entry>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <entry name='family'>Virtual Machine</entry>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </system>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  </sysinfo>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <os>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <boot dev='hd'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <smbios mode='sysinfo'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  </os>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <features>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <acpi/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <apic/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <vmcoreinfo state='on'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  </features>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <cpu mode='custom' match='exact' check='full'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <model fallback='forbid'>EPYC-Rome</model>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <vendor>AMD</vendor>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='x2apic'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='tsc-deadline'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='hypervisor'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='tsc_adjust'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='spec-ctrl'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='stibp'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='arch-capabilities'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='ssbd'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='cmp_legacy'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='overflow-recov'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='succor'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='ibrs'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='amd-ssbd'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='virt-ssbd'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='disable' name='lbrv'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='disable' name='tsc-scale'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='disable' name='vmcb-clean'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='disable' name='flushbyasid'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='disable' name='pause-filter'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='disable' name='pfthreshold'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='disable' name='svme-addr-chk'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='lfence-always-serializing'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='rdctl-no'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='mds-no'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='pschange-mc-no'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='gds-no'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='rfds-no'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='disable' name='xsaves'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='disable' name='svm'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='topoext'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='disable' name='npt'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='disable' name='nrip-save'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  </cpu>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <clock offset='utc'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <timer name='pit' tickpolicy='delay'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <timer name='rtc' tickpolicy='catchup'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <timer name='hpet' present='no'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  </clock>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <on_poweroff>destroy</on_poweroff>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <on_reboot>restart</on_reboot>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <on_crash>destroy</on_crash>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <devices>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <disk type='file' device='disk'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <driver name='qemu' type='qcow2' cache='none'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <source file='/var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk' index='2'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <backingStore type='file' index='3'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:        <format type='raw'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:        <source file='/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:        <backingStore/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      </backingStore>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target dev='vda' bus='virtio'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='virtio-disk0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <disk type='file' device='cdrom'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <driver name='qemu' type='raw' cache='none'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <source file='/var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.config' index='1'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <backingStore/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target dev='sda' bus='sata'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <readonly/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='sata0-0-0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='0' model='pcie-root'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pcie.0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='1' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='1' port='0x10'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.1'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='2' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='2' port='0x11'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.2'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='3' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='3' port='0x12'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.3'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='4' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='4' port='0x13'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.4'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='5' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='5' port='0x14'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.5'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='6' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='6' port='0x15'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.6'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='7' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='7' port='0x16'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.7'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='8' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='8' port='0x17'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.8'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='9' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='9' port='0x18'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.9'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='10' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='10' port='0x19'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.10'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='11' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='11' port='0x1a'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.11'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='12' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='12' port='0x1b'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.12'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='13' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='13' port='0x1c'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.13'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='14' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='14' port='0x1d'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.14'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='15' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='15' port='0x1e'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.15'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='16' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='16' port='0x1f'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.16'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='17' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='17' port='0x20'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.17'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='18' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='18' port='0x21'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.18'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='19' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='19' port='0x22'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.19'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='20' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='20' port='0x23'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.20'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='21' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='21' port='0x24'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.21'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='22' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='22' port='0x25'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.22'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='23' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='23' port='0x26'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.23'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='24' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='24' port='0x27'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.24'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='25' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='25' port='0x28'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.25'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-pci-bridge'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.26'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='usb' index='0' model='piix3-uhci'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='usb'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='sata' index='0'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='ide'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <interface type='ethernet'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <mac address='fa:16:3e:4e:85:2e'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target dev='tapbfb32e9e-52'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model type='virtio'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <driver name='vhost' rx_queue_size='512'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <mtu size='1442'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='net0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </interface>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <interface type='ethernet'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <mac address='fa:16:3e:11:ac:ba'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target dev='tapea81e5cb-74'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model type='virtio'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <driver name='vhost' rx_queue_size='512'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <mtu size='1442'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='net1'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </interface>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <serial type='pty'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <source path='/dev/pts/0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <log file='/var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/console.log' append='off'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target type='isa-serial' port='0'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:        <model name='isa-serial'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      </target>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='serial0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </serial>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <console type='pty' tty='/dev/pts/0'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <source path='/dev/pts/0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <log file='/var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/console.log' append='off'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target type='serial' port='0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='serial0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </console>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <input type='tablet' bus='usb'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='input0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='usb' bus='0' port='1'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </input>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <input type='mouse' bus='ps2'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='input1'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </input>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <input type='keyboard' bus='ps2'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='input2'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </input>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <listen type='address' address='::0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </graphics>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <audio id='1' type='none'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <video>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model type='virtio' heads='1' primary='yes'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='video0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </video>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <watchdog model='itco' action='reset'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='watchdog0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </watchdog>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <memballoon model='virtio'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <stats period='10'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='balloon0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </memballoon>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <rng model='virtio'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <backend model='random'>/dev/urandom</backend>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='rng0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </rng>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  </devices>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <label>system_u:system_r:svirt_t:s0:c55,c685</label>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c55,c685</imagelabel>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  </seclabel>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <label>+107:+107</label>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <imagelabel>+107:+107</imagelabel>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  </seclabel>
Oct  8 19:12:00 compute-0 nova_compute[117514]: </domain>
Oct  8 19:12:00 compute-0 nova_compute[117514]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.518 2 INFO nova.virt.libvirt.driver [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully detached device tapea81e5cb-74 from instance 783f8889-2bc8-4641-bdb9-95ee4226a2fd from the persistent domain config.#033[00m
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.518 2 DEBUG nova.virt.libvirt.driver [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] (1/8): Attempting to detach device tapea81e5cb-74 with device alias net1 from instance 783f8889-2bc8-4641-bdb9-95ee4226a2fd from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.518 2 DEBUG nova.virt.libvirt.guest [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] detach device xml: <interface type="ethernet">
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <mac address="fa:16:3e:11:ac:ba"/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <model type="virtio"/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <driver name="vhost" rx_queue_size="512"/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <mtu size="1442"/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <target dev="tapea81e5cb-74"/>
Oct  8 19:12:00 compute-0 nova_compute[117514]: </interface>
Oct  8 19:12:00 compute-0 nova_compute[117514]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Oct  8 19:12:00 compute-0 kernel: tapea81e5cb-74 (unregistering): left promiscuous mode
Oct  8 19:12:00 compute-0 NetworkManager[1035]: <info>  [1759950720.6153] device (tapea81e5cb-74): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 19:12:00 compute-0 ovn_controller[19759]: 2025-10-08T19:12:00Z|00102|binding|INFO|Releasing lport ea81e5cb-74ba-43da-a780-3f1f699fa0d6 from this chassis (sb_readonly=0)
Oct  8 19:12:00 compute-0 ovn_controller[19759]: 2025-10-08T19:12:00Z|00103|binding|INFO|Setting lport ea81e5cb-74ba-43da-a780-3f1f699fa0d6 down in Southbound
Oct  8 19:12:00 compute-0 ovn_controller[19759]: 2025-10-08T19:12:00Z|00104|binding|INFO|Removing iface tapea81e5cb-74 ovn-installed in OVS
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.663 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.665 2 DEBUG nova.virt.libvirt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Received event <DeviceRemovedEvent: 1759950720.6636052, 783f8889-2bc8-4641-bdb9-95ee4226a2fd => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.667 2 DEBUG nova.virt.libvirt.driver [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Start waiting for the detach event from libvirt for device tapea81e5cb-74 with device alias net1 for instance 783f8889-2bc8-4641-bdb9-95ee4226a2fd _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.668 2 DEBUG nova.virt.libvirt.guest [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:11:ac:ba"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapea81e5cb-74"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Oct  8 19:12:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:00.668 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:11:ac:ba 10.100.0.22', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.22/28', 'neutron:device_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e3bc67bd-dd21-4701-b445-33eb52179602', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=325bd26c-56bb-4683-8b62-92cc8f266207, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=ea81e5cb-74ba-43da-a780-3f1f699fa0d6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  8 19:12:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:00.669 28643 INFO neutron.agent.ovn.metadata.agent [-] Port ea81e5cb-74ba-43da-a780-3f1f699fa0d6 in datapath e3bc67bd-dd21-4701-b445-33eb52179602 unbound from our chassis
Oct  8 19:12:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:00.670 28643 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e3bc67bd-dd21-4701-b445-33eb52179602, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct  8 19:12:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:00.671 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[cce2bf17-6b30-436e-a3dc-3f302d6d4a0d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  8 19:12:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:00.672 28643 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602 namespace which is not needed anymore
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.678 2 DEBUG nova.virt.libvirt.guest [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:11:ac:ba"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapea81e5cb-74"/></interface> not found in domain: <domain type='kvm' id='6'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <name>instance-00000006</name>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <uuid>783f8889-2bc8-4641-bdb9-95ee4226a2fd</uuid>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <metadata>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <nova:name>tempest-TestNetworkBasicOps-server-1641480242</nova:name>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <nova:creationTime>2025-10-08 19:10:30</nova:creationTime>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <nova:flavor name="m1.nano">
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <nova:memory>128</nova:memory>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <nova:disk>1</nova:disk>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <nova:swap>0</nova:swap>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <nova:ephemeral>0</nova:ephemeral>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <nova:vcpus>1</nova:vcpus>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  </nova:flavor>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <nova:owner>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  </nova:owner>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <nova:ports>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <nova:port uuid="bfb32e9e-52b6-4043-b9a6-129d11fa2814">
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </nova:port>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <nova:port uuid="ea81e5cb-74ba-43da-a780-3f1f699fa0d6">
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <nova:ip type="fixed" address="10.100.0.22" ipVersion="4"/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </nova:port>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  </nova:ports>
Oct  8 19:12:00 compute-0 nova_compute[117514]: </nova:instance>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  </metadata>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <memory unit='KiB'>131072</memory>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <currentMemory unit='KiB'>131072</currentMemory>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <vcpu placement='static'>1</vcpu>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <resource>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <partition>/machine</partition>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  </resource>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <sysinfo type='smbios'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <system>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <entry name='manufacturer'>RDO</entry>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <entry name='product'>OpenStack Compute</entry>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <entry name='serial'>783f8889-2bc8-4641-bdb9-95ee4226a2fd</entry>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <entry name='uuid'>783f8889-2bc8-4641-bdb9-95ee4226a2fd</entry>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <entry name='family'>Virtual Machine</entry>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </system>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  </sysinfo>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <os>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <boot dev='hd'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <smbios mode='sysinfo'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  </os>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <features>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <acpi/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <apic/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <vmcoreinfo state='on'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  </features>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <cpu mode='custom' match='exact' check='full'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <model fallback='forbid'>EPYC-Rome</model>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <vendor>AMD</vendor>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='x2apic'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='tsc-deadline'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='hypervisor'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='tsc_adjust'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='spec-ctrl'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='stibp'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='arch-capabilities'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='ssbd'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='cmp_legacy'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='overflow-recov'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='succor'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='ibrs'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='amd-ssbd'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='virt-ssbd'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='disable' name='lbrv'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='disable' name='tsc-scale'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='disable' name='vmcb-clean'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='disable' name='flushbyasid'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='disable' name='pause-filter'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='disable' name='pfthreshold'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='disable' name='svme-addr-chk'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='lfence-always-serializing'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='rdctl-no'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='mds-no'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='pschange-mc-no'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='gds-no'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='rfds-no'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='disable' name='xsaves'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='disable' name='svm'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='require' name='topoext'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='disable' name='npt'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <feature policy='disable' name='nrip-save'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  </cpu>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <clock offset='utc'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <timer name='pit' tickpolicy='delay'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <timer name='rtc' tickpolicy='catchup'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <timer name='hpet' present='no'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  </clock>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <on_poweroff>destroy</on_poweroff>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <on_reboot>restart</on_reboot>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <on_crash>destroy</on_crash>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <devices>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <disk type='file' device='disk'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <driver name='qemu' type='qcow2' cache='none'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <source file='/var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk' index='2'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <backingStore type='file' index='3'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:        <format type='raw'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:        <source file='/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:        <backingStore/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      </backingStore>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target dev='vda' bus='virtio'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='virtio-disk0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <disk type='file' device='cdrom'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <driver name='qemu' type='raw' cache='none'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <source file='/var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/disk.config' index='1'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <backingStore/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target dev='sda' bus='sata'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <readonly/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='sata0-0-0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='0' model='pcie-root'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pcie.0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='1' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='1' port='0x10'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.1'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='2' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='2' port='0x11'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.2'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='3' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='3' port='0x12'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.3'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='4' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='4' port='0x13'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.4'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='5' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='5' port='0x14'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.5'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='6' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='6' port='0x15'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.6'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='7' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='7' port='0x16'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.7'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='8' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='8' port='0x17'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.8'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='9' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='9' port='0x18'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.9'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='10' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='10' port='0x19'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.10'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='11' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='11' port='0x1a'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.11'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='12' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='12' port='0x1b'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.12'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='13' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='13' port='0x1c'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.13'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='14' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='14' port='0x1d'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.14'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='15' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='15' port='0x1e'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.15'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='16' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='16' port='0x1f'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.16'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='17' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='17' port='0x20'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.17'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='18' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='18' port='0x21'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.18'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='19' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='19' port='0x22'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.19'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='20' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='20' port='0x23'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.20'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='21' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='21' port='0x24'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.21'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='22' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='22' port='0x25'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.22'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='23' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='23' port='0x26'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.23'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='24' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='24' port='0x27'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.24'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='25' model='pcie-root-port'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-root-port'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target chassis='25' port='0x28'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.25'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model name='pcie-pci-bridge'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='pci.26'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='usb' index='0' model='piix3-uhci'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='usb'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <controller type='sata' index='0'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='ide'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </controller>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <interface type='ethernet'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <mac address='fa:16:3e:4e:85:2e'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target dev='tapbfb32e9e-52'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model type='virtio'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <driver name='vhost' rx_queue_size='512'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <mtu size='1442'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='net0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </interface>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <serial type='pty'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <source path='/dev/pts/0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <log file='/var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/console.log' append='off'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target type='isa-serial' port='0'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:        <model name='isa-serial'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      </target>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='serial0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </serial>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <console type='pty' tty='/dev/pts/0'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <source path='/dev/pts/0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <log file='/var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd/console.log' append='off'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <target type='serial' port='0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='serial0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </console>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <input type='tablet' bus='usb'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='input0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='usb' bus='0' port='1'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </input>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <input type='mouse' bus='ps2'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='input1'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </input>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <input type='keyboard' bus='ps2'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='input2'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </input>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <listen type='address' address='::0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </graphics>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <audio id='1' type='none'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <video>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <model type='virtio' heads='1' primary='yes'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='video0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </video>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <watchdog model='itco' action='reset'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='watchdog0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </watchdog>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <memballoon model='virtio'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <stats period='10'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='balloon0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </memballoon>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <rng model='virtio'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <backend model='random'>/dev/urandom</backend>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <alias name='rng0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </rng>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  </devices>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <label>system_u:system_r:svirt_t:s0:c55,c685</label>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c55,c685</imagelabel>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  </seclabel>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <label>+107:+107</label>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <imagelabel>+107:+107</imagelabel>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  </seclabel>
Oct  8 19:12:00 compute-0 nova_compute[117514]: </domain>
Oct  8 19:12:00 compute-0 nova_compute[117514]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.678 2 INFO nova.virt.libvirt.driver [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully detached device tapea81e5cb-74 from instance 783f8889-2bc8-4641-bdb9-95ee4226a2fd from the live domain config.#033[00m
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.680 2 DEBUG nova.virt.libvirt.vif [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:09:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1641480242',display_name='tempest-TestNetworkBasicOps-server-1641480242',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1641480242',id=6,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPSyEE+QeB2DOtd7xoaY+J9mVl+DzPE43UDhso7eEGO9aQXs3wmj/YcqHfJ97lRUVFOa3dbwNiIUyunSI3DyzjQf/v6cjCZ2KkxRD0GJnQ0zRM5omnXaZRnz3Bq5VONa9g==',key_name='tempest-TestNetworkBasicOps-1535027603',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:09:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-pbt1zket',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:09:58Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=783f8889-2bc8-4641-bdb9-95ee4226a2fd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "address": "fa:16:3e:11:ac:ba", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea81e5cb-74", "ovs_interfaceid": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.681 2 DEBUG nova.network.os_vif_util [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "address": "fa:16:3e:11:ac:ba", "network": {"id": "e3bc67bd-dd21-4701-b445-33eb52179602", "bridge": "br-int", "label": "tempest-network-smoke--1834937033", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea81e5cb-74", "ovs_interfaceid": "ea81e5cb-74ba-43da-a780-3f1f699fa0d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.682 2 DEBUG nova.network.os_vif_util [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:11:ac:ba,bridge_name='br-int',has_traffic_filtering=True,id=ea81e5cb-74ba-43da-a780-3f1f699fa0d6,network=Network(e3bc67bd-dd21-4701-b445-33eb52179602),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea81e5cb-74') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.683 2 DEBUG os_vif [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:11:ac:ba,bridge_name='br-int',has_traffic_filtering=True,id=ea81e5cb-74ba-43da-a780-3f1f699fa0d6,network=Network(e3bc67bd-dd21-4701-b445-33eb52179602),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea81e5cb-74') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.685 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapea81e5cb-74, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.690 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.695 2 INFO os_vif [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:11:ac:ba,bridge_name='br-int',has_traffic_filtering=True,id=ea81e5cb-74ba-43da-a780-3f1f699fa0d6,network=Network(e3bc67bd-dd21-4701-b445-33eb52179602),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea81e5cb-74')#033[00m
Oct  8 19:12:00 compute-0 nova_compute[117514]: 2025-10-08 19:12:00.696 2 DEBUG nova.virt.libvirt.guest [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <nova:name>tempest-TestNetworkBasicOps-server-1641480242</nova:name>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <nova:creationTime>2025-10-08 19:12:00</nova:creationTime>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <nova:flavor name="m1.nano">
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <nova:memory>128</nova:memory>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <nova:disk>1</nova:disk>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <nova:swap>0</nova:swap>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <nova:ephemeral>0</nova:ephemeral>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <nova:vcpus>1</nova:vcpus>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  </nova:flavor>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <nova:owner>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  </nova:owner>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  <nova:ports>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    <nova:port uuid="bfb32e9e-52b6-4043-b9a6-129d11fa2814">
Oct  8 19:12:00 compute-0 nova_compute[117514]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Oct  8 19:12:00 compute-0 nova_compute[117514]:    </nova:port>
Oct  8 19:12:00 compute-0 nova_compute[117514]:  </nova:ports>
Oct  8 19:12:00 compute-0 nova_compute[117514]: </nova:instance>
Oct  8 19:12:00 compute-0 nova_compute[117514]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Oct  8 19:12:00 compute-0 podman[148319]: 2025-10-08 19:12:00.719649533 +0000 UTC m=+0.129808978 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, version=9.6, architecture=x86_64, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, release=1755695350, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9)
Oct  8 19:12:00 compute-0 podman[148320]: 2025-10-08 19:12:00.739531348 +0000 UTC m=+0.147768647 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  8 19:12:00 compute-0 neutron-haproxy-ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602[147719]: [NOTICE]   (147733) : haproxy version is 2.8.14-c23fe91
Oct  8 19:12:00 compute-0 neutron-haproxy-ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602[147719]: [NOTICE]   (147733) : path to executable is /usr/sbin/haproxy
Oct  8 19:12:00 compute-0 neutron-haproxy-ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602[147719]: [WARNING]  (147733) : Exiting Master process...
Oct  8 19:12:00 compute-0 neutron-haproxy-ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602[147719]: [ALERT]    (147733) : Current worker (147741) exited with code 143 (Terminated)
Oct  8 19:12:00 compute-0 neutron-haproxy-ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602[147719]: [WARNING]  (147733) : All workers exited. Exiting... (0)
Oct  8 19:12:00 compute-0 systemd[1]: libpod-0263a6d21769e3c38d37f8ef90b039ce1a54c225bd0e154d3b91c06548a70e40.scope: Deactivated successfully.
Oct  8 19:12:00 compute-0 conmon[147719]: conmon 0263a6d21769e3c38d37 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-0263a6d21769e3c38d37f8ef90b039ce1a54c225bd0e154d3b91c06548a70e40.scope/container/memory.events
Oct  8 19:12:00 compute-0 podman[148380]: 2025-10-08 19:12:00.945383306 +0000 UTC m=+0.152696930 container died 0263a6d21769e3c38d37f8ef90b039ce1a54c225bd0e154d3b91c06548a70e40 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3)
Oct  8 19:12:01 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0263a6d21769e3c38d37f8ef90b039ce1a54c225bd0e154d3b91c06548a70e40-userdata-shm.mount: Deactivated successfully.
Oct  8 19:12:01 compute-0 systemd[1]: var-lib-containers-storage-overlay-dbb48eabd75c06324d43161ee30ba9aa731d2058597c747932479a573242f471-merged.mount: Deactivated successfully.
Oct  8 19:12:01 compute-0 podman[148380]: 2025-10-08 19:12:01.110181054 +0000 UTC m=+0.317494648 container cleanup 0263a6d21769e3c38d37f8ef90b039ce1a54c225bd0e154d3b91c06548a70e40 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 19:12:01 compute-0 systemd[1]: libpod-conmon-0263a6d21769e3c38d37f8ef90b039ce1a54c225bd0e154d3b91c06548a70e40.scope: Deactivated successfully.
Oct  8 19:12:01 compute-0 nova_compute[117514]: 2025-10-08 19:12:01.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:01 compute-0 podman[148415]: 2025-10-08 19:12:01.174238368 +0000 UTC m=+0.045091556 container remove 0263a6d21769e3c38d37f8ef90b039ce1a54c225bd0e154d3b91c06548a70e40 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  8 19:12:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:01.183 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[aae42e50-311f-48fa-a3c9-1c9e15485cb7]: (4, ('Wed Oct  8 07:12:00 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602 (0263a6d21769e3c38d37f8ef90b039ce1a54c225bd0e154d3b91c06548a70e40)\n0263a6d21769e3c38d37f8ef90b039ce1a54c225bd0e154d3b91c06548a70e40\nWed Oct  8 07:12:01 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602 (0263a6d21769e3c38d37f8ef90b039ce1a54c225bd0e154d3b91c06548a70e40)\n0263a6d21769e3c38d37f8ef90b039ce1a54c225bd0e154d3b91c06548a70e40\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:01.185 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[12839b11-3b44-4157-b1e9-ec898d017753]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:01.186 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape3bc67bd-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:12:01 compute-0 nova_compute[117514]: 2025-10-08 19:12:01.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:01 compute-0 kernel: tape3bc67bd-d0: left promiscuous mode
Oct  8 19:12:01 compute-0 nova_compute[117514]: 2025-10-08 19:12:01.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:01.193 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[56735f69-be01-4d74-92f1-1dfdf4e200e7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:01 compute-0 nova_compute[117514]: 2025-10-08 19:12:01.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:01.237 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[c809fb2b-e047-4e16-a3df-daab626a0cf6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:01.239 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[92e5f1c3-8724-4f92-ae62-b9a4597d7ef2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:01.257 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[e10dbd20-ba9f-4bf0-91f5-c24603f812ed]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 130089, 'reachable_time': 29829, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 148429, 'error': None, 'target': 'ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:01.260 28783 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e3bc67bd-dd21-4701-b445-33eb52179602 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 19:12:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:01.260 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[94f37eed-b18c-4c7b-89b1-649a842c59c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:01 compute-0 systemd[1]: run-netns-ovnmeta\x2de3bc67bd\x2ddd21\x2d4701\x2db445\x2d33eb52179602.mount: Deactivated successfully.
Oct  8 19:12:01 compute-0 nova_compute[117514]: 2025-10-08 19:12:01.289 2 DEBUG nova.compute.manager [req-6a1e267f-3daf-4b78-a19b-302bd76702de req-e5448483-8e82-4711-abf2-a455b1ed91ca bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received event network-vif-unplugged-ea81e5cb-74ba-43da-a780-3f1f699fa0d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:12:01 compute-0 nova_compute[117514]: 2025-10-08 19:12:01.289 2 DEBUG oslo_concurrency.lockutils [req-6a1e267f-3daf-4b78-a19b-302bd76702de req-e5448483-8e82-4711-abf2-a455b1ed91ca bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:12:01 compute-0 nova_compute[117514]: 2025-10-08 19:12:01.290 2 DEBUG oslo_concurrency.lockutils [req-6a1e267f-3daf-4b78-a19b-302bd76702de req-e5448483-8e82-4711-abf2-a455b1ed91ca bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:12:01 compute-0 nova_compute[117514]: 2025-10-08 19:12:01.290 2 DEBUG oslo_concurrency.lockutils [req-6a1e267f-3daf-4b78-a19b-302bd76702de req-e5448483-8e82-4711-abf2-a455b1ed91ca bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:12:01 compute-0 nova_compute[117514]: 2025-10-08 19:12:01.291 2 DEBUG nova.compute.manager [req-6a1e267f-3daf-4b78-a19b-302bd76702de req-e5448483-8e82-4711-abf2-a455b1ed91ca bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] No waiting events found dispatching network-vif-unplugged-ea81e5cb-74ba-43da-a780-3f1f699fa0d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:12:01 compute-0 nova_compute[117514]: 2025-10-08 19:12:01.291 2 WARNING nova.compute.manager [req-6a1e267f-3daf-4b78-a19b-302bd76702de req-e5448483-8e82-4711-abf2-a455b1ed91ca bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received unexpected event network-vif-unplugged-ea81e5cb-74ba-43da-a780-3f1f699fa0d6 for instance with vm_state active and task_state None.#033[00m
Oct  8 19:12:01 compute-0 nova_compute[117514]: 2025-10-08 19:12:01.565 2 DEBUG oslo_concurrency.lockutils [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:12:01 compute-0 nova_compute[117514]: 2025-10-08 19:12:01.566 2 DEBUG oslo_concurrency.lockutils [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquired lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:12:01 compute-0 nova_compute[117514]: 2025-10-08 19:12:01.566 2 DEBUG nova.network.neutron [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 19:12:02 compute-0 ovn_controller[19759]: 2025-10-08T19:12:02Z|00105|binding|INFO|Releasing lport ef1b5170-2d11-4e01-98e4-310f59c22ecd from this chassis (sb_readonly=0)
Oct  8 19:12:02 compute-0 nova_compute[117514]: 2025-10-08 19:12:02.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.157 2 INFO nova.network.neutron [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Port ea81e5cb-74ba-43da-a780-3f1f699fa0d6 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.158 2 DEBUG nova.network.neutron [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Updating instance_info_cache with network_info: [{"id": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "address": "fa:16:3e:4e:85:2e", "network": {"id": "0d073e98-c9f2-4b90-8237-84ff2fa99090", "bridge": "br-int", "label": "tempest-network-smoke--1785011615", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb32e9e-52", "ovs_interfaceid": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.177 2 DEBUG oslo_concurrency.lockutils [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Releasing lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.201 2 DEBUG oslo_concurrency.lockutils [None req-c33bc6c1-b3aa-4d5c-9bd2-ec101e8e429e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "interface-783f8889-2bc8-4641-bdb9-95ee4226a2fd-ea81e5cb-74ba-43da-a780-3f1f699fa0d6" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 2.756s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.392 2 DEBUG nova.compute.manager [req-c9052887-e383-4b55-822a-7f514a42851a req-74fc0c1e-d429-4ca4-bc1e-d2c066760e93 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received event network-vif-plugged-ea81e5cb-74ba-43da-a780-3f1f699fa0d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.393 2 DEBUG oslo_concurrency.lockutils [req-c9052887-e383-4b55-822a-7f514a42851a req-74fc0c1e-d429-4ca4-bc1e-d2c066760e93 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.393 2 DEBUG oslo_concurrency.lockutils [req-c9052887-e383-4b55-822a-7f514a42851a req-74fc0c1e-d429-4ca4-bc1e-d2c066760e93 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.394 2 DEBUG oslo_concurrency.lockutils [req-c9052887-e383-4b55-822a-7f514a42851a req-74fc0c1e-d429-4ca4-bc1e-d2c066760e93 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.394 2 DEBUG nova.compute.manager [req-c9052887-e383-4b55-822a-7f514a42851a req-74fc0c1e-d429-4ca4-bc1e-d2c066760e93 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] No waiting events found dispatching network-vif-plugged-ea81e5cb-74ba-43da-a780-3f1f699fa0d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.395 2 WARNING nova.compute.manager [req-c9052887-e383-4b55-822a-7f514a42851a req-74fc0c1e-d429-4ca4-bc1e-d2c066760e93 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received unexpected event network-vif-plugged-ea81e5cb-74ba-43da-a780-3f1f699fa0d6 for instance with vm_state active and task_state None.#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.395 2 DEBUG nova.compute.manager [req-c9052887-e383-4b55-822a-7f514a42851a req-74fc0c1e-d429-4ca4-bc1e-d2c066760e93 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received event network-vif-deleted-ea81e5cb-74ba-43da-a780-3f1f699fa0d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.562 2 DEBUG nova.compute.manager [req-de0625b8-e44f-4bc1-a966-d6a52f2b1e9d req-c6626794-43f7-41b2-9e2f-64e413f363b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received event network-changed-bfb32e9e-52b6-4043-b9a6-129d11fa2814 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.563 2 DEBUG nova.compute.manager [req-de0625b8-e44f-4bc1-a966-d6a52f2b1e9d req-c6626794-43f7-41b2-9e2f-64e413f363b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Refreshing instance network info cache due to event network-changed-bfb32e9e-52b6-4043-b9a6-129d11fa2814. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.563 2 DEBUG oslo_concurrency.lockutils [req-de0625b8-e44f-4bc1-a966-d6a52f2b1e9d req-c6626794-43f7-41b2-9e2f-64e413f363b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.563 2 DEBUG oslo_concurrency.lockutils [req-de0625b8-e44f-4bc1-a966-d6a52f2b1e9d req-c6626794-43f7-41b2-9e2f-64e413f363b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.563 2 DEBUG nova.network.neutron [req-de0625b8-e44f-4bc1-a966-d6a52f2b1e9d req-c6626794-43f7-41b2-9e2f-64e413f363b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Refreshing network info cache for port bfb32e9e-52b6-4043-b9a6-129d11fa2814 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.649 2 DEBUG oslo_concurrency.lockutils [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.650 2 DEBUG oslo_concurrency.lockutils [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.650 2 DEBUG oslo_concurrency.lockutils [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.651 2 DEBUG oslo_concurrency.lockutils [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.651 2 DEBUG oslo_concurrency.lockutils [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.653 2 INFO nova.compute.manager [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Terminating instance#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.655 2 DEBUG nova.compute.manager [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 19:12:03 compute-0 kernel: tapbfb32e9e-52 (unregistering): left promiscuous mode
Oct  8 19:12:03 compute-0 NetworkManager[1035]: <info>  [1759950723.6837] device (tapbfb32e9e-52): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:03 compute-0 ovn_controller[19759]: 2025-10-08T19:12:03Z|00106|binding|INFO|Releasing lport bfb32e9e-52b6-4043-b9a6-129d11fa2814 from this chassis (sb_readonly=0)
Oct  8 19:12:03 compute-0 ovn_controller[19759]: 2025-10-08T19:12:03Z|00107|binding|INFO|Setting lport bfb32e9e-52b6-4043-b9a6-129d11fa2814 down in Southbound
Oct  8 19:12:03 compute-0 ovn_controller[19759]: 2025-10-08T19:12:03Z|00108|binding|INFO|Removing iface tapbfb32e9e-52 ovn-installed in OVS
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:03.707 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4e:85:2e 10.100.0.14'], port_security=['fa:16:3e:4e:85:2e 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '783f8889-2bc8-4641-bdb9-95ee4226a2fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0d073e98-c9f2-4b90-8237-84ff2fa99090', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e1f96720-345d-4fd7-8b5f-d68f6fe81454', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cd3b59ed-5967-491c-a3b5-d0ba2b165b15, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=bfb32e9e-52b6-4043-b9a6-129d11fa2814) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:12:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:03.709 28643 INFO neutron.agent.ovn.metadata.agent [-] Port bfb32e9e-52b6-4043-b9a6-129d11fa2814 in datapath 0d073e98-c9f2-4b90-8237-84ff2fa99090 unbound from our chassis#033[00m
Oct  8 19:12:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:03.711 28643 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0d073e98-c9f2-4b90-8237-84ff2fa99090, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 19:12:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:03.712 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[f4e61a3f-647f-4223-b0a0-be0eb8ece5b0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:03.713 28643 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090 namespace which is not needed anymore#033[00m
Oct  8 19:12:03 compute-0 podman[148434]: 2025-10-08 19:12:03.722363078 +0000 UTC m=+0.128600853 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.741 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:12:03 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000006.scope: Deactivated successfully.
Oct  8 19:12:03 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000006.scope: Consumed 19.686s CPU time.
Oct  8 19:12:03 compute-0 systemd-machined[77568]: Machine qemu-6-instance-00000006 terminated.
Oct  8 19:12:03 compute-0 NetworkManager[1035]: <info>  [1759950723.8768] manager: (tapbfb32e9e-52): new Tun device (/org/freedesktop/NetworkManager/Devices/63)
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.921 2 INFO nova.virt.libvirt.driver [-] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Instance destroyed successfully.#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.921 2 DEBUG nova.objects.instance [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'resources' on Instance uuid 783f8889-2bc8-4641-bdb9-95ee4226a2fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.938 2 DEBUG nova.virt.libvirt.vif [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:09:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1641480242',display_name='tempest-TestNetworkBasicOps-server-1641480242',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1641480242',id=6,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPSyEE+QeB2DOtd7xoaY+J9mVl+DzPE43UDhso7eEGO9aQXs3wmj/YcqHfJ97lRUVFOa3dbwNiIUyunSI3DyzjQf/v6cjCZ2KkxRD0GJnQ0zRM5omnXaZRnz3Bq5VONa9g==',key_name='tempest-TestNetworkBasicOps-1535027603',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:09:58Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-pbt1zket',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:09:58Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=783f8889-2bc8-4641-bdb9-95ee4226a2fd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "address": "fa:16:3e:4e:85:2e", "network": {"id": "0d073e98-c9f2-4b90-8237-84ff2fa99090", "bridge": "br-int", "label": "tempest-network-smoke--1785011615", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb32e9e-52", "ovs_interfaceid": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.939 2 DEBUG nova.network.os_vif_util [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "address": "fa:16:3e:4e:85:2e", "network": {"id": "0d073e98-c9f2-4b90-8237-84ff2fa99090", "bridge": "br-int", "label": "tempest-network-smoke--1785011615", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb32e9e-52", "ovs_interfaceid": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.939 2 DEBUG nova.network.os_vif_util [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4e:85:2e,bridge_name='br-int',has_traffic_filtering=True,id=bfb32e9e-52b6-4043-b9a6-129d11fa2814,network=Network(0d073e98-c9f2-4b90-8237-84ff2fa99090),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfb32e9e-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.940 2 DEBUG os_vif [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4e:85:2e,bridge_name='br-int',has_traffic_filtering=True,id=bfb32e9e-52b6-4043-b9a6-129d11fa2814,network=Network(0d073e98-c9f2-4b90-8237-84ff2fa99090),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfb32e9e-52') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.943 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbfb32e9e-52, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.951 2 INFO os_vif [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4e:85:2e,bridge_name='br-int',has_traffic_filtering=True,id=bfb32e9e-52b6-4043-b9a6-129d11fa2814,network=Network(0d073e98-c9f2-4b90-8237-84ff2fa99090),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfb32e9e-52')#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.952 2 INFO nova.virt.libvirt.driver [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Deleting instance files /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd_del#033[00m
Oct  8 19:12:03 compute-0 nova_compute[117514]: 2025-10-08 19:12:03.953 2 INFO nova.virt.libvirt.driver [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Deletion of /var/lib/nova/instances/783f8889-2bc8-4641-bdb9-95ee4226a2fd_del complete#033[00m
Oct  8 19:12:03 compute-0 neutron-haproxy-ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090[147419]: [NOTICE]   (147423) : haproxy version is 2.8.14-c23fe91
Oct  8 19:12:03 compute-0 neutron-haproxy-ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090[147419]: [NOTICE]   (147423) : path to executable is /usr/sbin/haproxy
Oct  8 19:12:03 compute-0 neutron-haproxy-ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090[147419]: [WARNING]  (147423) : Exiting Master process...
Oct  8 19:12:03 compute-0 neutron-haproxy-ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090[147419]: [WARNING]  (147423) : Exiting Master process...
Oct  8 19:12:03 compute-0 neutron-haproxy-ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090[147419]: [ALERT]    (147423) : Current worker (147425) exited with code 143 (Terminated)
Oct  8 19:12:03 compute-0 neutron-haproxy-ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090[147419]: [WARNING]  (147423) : All workers exited. Exiting... (0)
Oct  8 19:12:03 compute-0 systemd[1]: libpod-a10c865d8f33b87218a52fa6d5b32a88534c0d9b9c0a166b926f6b4b8ef386fc.scope: Deactivated successfully.
Oct  8 19:12:03 compute-0 podman[148484]: 2025-10-08 19:12:03.972838737 +0000 UTC m=+0.117522042 container died a10c865d8f33b87218a52fa6d5b32a88534c0d9b9c0a166b926f6b4b8ef386fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 19:12:04 compute-0 nova_compute[117514]: 2025-10-08 19:12:04.012 2 INFO nova.compute.manager [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 19:12:04 compute-0 nova_compute[117514]: 2025-10-08 19:12:04.013 2 DEBUG oslo.service.loopingcall [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 19:12:04 compute-0 nova_compute[117514]: 2025-10-08 19:12:04.013 2 DEBUG nova.compute.manager [-] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 19:12:04 compute-0 nova_compute[117514]: 2025-10-08 19:12:04.013 2 DEBUG nova.network.neutron [-] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 19:12:04 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a10c865d8f33b87218a52fa6d5b32a88534c0d9b9c0a166b926f6b4b8ef386fc-userdata-shm.mount: Deactivated successfully.
Oct  8 19:12:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-3a1feba6530d5a46f34a3cb37ffae2c111c4760047000322a985a4db99d10005-merged.mount: Deactivated successfully.
Oct  8 19:12:04 compute-0 podman[148484]: 2025-10-08 19:12:04.057530407 +0000 UTC m=+0.202213722 container cleanup a10c865d8f33b87218a52fa6d5b32a88534c0d9b9c0a166b926f6b4b8ef386fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  8 19:12:04 compute-0 systemd[1]: libpod-conmon-a10c865d8f33b87218a52fa6d5b32a88534c0d9b9c0a166b926f6b4b8ef386fc.scope: Deactivated successfully.
Oct  8 19:12:04 compute-0 podman[148529]: 2025-10-08 19:12:04.142922969 +0000 UTC m=+0.056110035 container remove a10c865d8f33b87218a52fa6d5b32a88534c0d9b9c0a166b926f6b4b8ef386fc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 19:12:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:04.151 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[0705cef5-5347-423c-81a1-907173043981]: (4, ('Wed Oct  8 07:12:03 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090 (a10c865d8f33b87218a52fa6d5b32a88534c0d9b9c0a166b926f6b4b8ef386fc)\na10c865d8f33b87218a52fa6d5b32a88534c0d9b9c0a166b926f6b4b8ef386fc\nWed Oct  8 07:12:04 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090 (a10c865d8f33b87218a52fa6d5b32a88534c0d9b9c0a166b926f6b4b8ef386fc)\na10c865d8f33b87218a52fa6d5b32a88534c0d9b9c0a166b926f6b4b8ef386fc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:04.153 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[f32e71f6-abf7-41a8-a869-1c2825262be4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:04.153 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0d073e98-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:12:04 compute-0 nova_compute[117514]: 2025-10-08 19:12:04.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:04 compute-0 kernel: tap0d073e98-c0: left promiscuous mode
Oct  8 19:12:04 compute-0 nova_compute[117514]: 2025-10-08 19:12:04.179 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:04.184 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[02956841-c220-44fa-990d-24034cc7f54a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:04.210 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[b9ac2b8d-b441-4549-b739-15143ce2b8ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:04.211 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[f1e961ab-fd12-46e2-a24b-74511d41797c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:04.226 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[3e033ace-347b-4a76-bb6e-4c942769317e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 126810, 'reachable_time': 22549, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 148544, 'error': None, 'target': 'ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:04.228 28783 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0d073e98-c9f2-4b90-8237-84ff2fa99090 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 19:12:04 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:04.228 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[ed4393d4-631e-4003-97ce-23b78cf3a537]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:04 compute-0 systemd[1]: run-netns-ovnmeta\x2d0d073e98\x2dc9f2\x2d4b90\x2d8237\x2d84ff2fa99090.mount: Deactivated successfully.
Oct  8 19:12:04 compute-0 nova_compute[117514]: 2025-10-08 19:12:04.719 2 DEBUG nova.network.neutron [-] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:12:04 compute-0 nova_compute[117514]: 2025-10-08 19:12:04.737 2 INFO nova.compute.manager [-] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Took 0.72 seconds to deallocate network for instance.#033[00m
Oct  8 19:12:04 compute-0 nova_compute[117514]: 2025-10-08 19:12:04.779 2 DEBUG oslo_concurrency.lockutils [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:12:04 compute-0 nova_compute[117514]: 2025-10-08 19:12:04.779 2 DEBUG oslo_concurrency.lockutils [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:12:04 compute-0 nova_compute[117514]: 2025-10-08 19:12:04.851 2 DEBUG nova.compute.provider_tree [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:12:04 compute-0 nova_compute[117514]: 2025-10-08 19:12:04.868 2 DEBUG nova.scheduler.client.report [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:12:04 compute-0 nova_compute[117514]: 2025-10-08 19:12:04.890 2 DEBUG oslo_concurrency.lockutils [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:12:04 compute-0 nova_compute[117514]: 2025-10-08 19:12:04.917 2 INFO nova.scheduler.client.report [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Deleted allocations for instance 783f8889-2bc8-4641-bdb9-95ee4226a2fd#033[00m
Oct  8 19:12:04 compute-0 nova_compute[117514]: 2025-10-08 19:12:04.974 2 DEBUG oslo_concurrency.lockutils [None req-dca986c0-27e2-4eea-a468-53c725fbf59f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.324s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:12:05 compute-0 nova_compute[117514]: 2025-10-08 19:12:05.076 2 DEBUG nova.network.neutron [req-de0625b8-e44f-4bc1-a966-d6a52f2b1e9d req-c6626794-43f7-41b2-9e2f-64e413f363b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Updated VIF entry in instance network info cache for port bfb32e9e-52b6-4043-b9a6-129d11fa2814. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 19:12:05 compute-0 nova_compute[117514]: 2025-10-08 19:12:05.076 2 DEBUG nova.network.neutron [req-de0625b8-e44f-4bc1-a966-d6a52f2b1e9d req-c6626794-43f7-41b2-9e2f-64e413f363b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Updating instance_info_cache with network_info: [{"id": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "address": "fa:16:3e:4e:85:2e", "network": {"id": "0d073e98-c9f2-4b90-8237-84ff2fa99090", "bridge": "br-int", "label": "tempest-network-smoke--1785011615", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfb32e9e-52", "ovs_interfaceid": "bfb32e9e-52b6-4043-b9a6-129d11fa2814", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:12:05 compute-0 nova_compute[117514]: 2025-10-08 19:12:05.093 2 DEBUG oslo_concurrency.lockutils [req-de0625b8-e44f-4bc1-a966-d6a52f2b1e9d req-c6626794-43f7-41b2-9e2f-64e413f363b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-783f8889-2bc8-4641-bdb9-95ee4226a2fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:12:05 compute-0 nova_compute[117514]: 2025-10-08 19:12:05.502 2 DEBUG nova.compute.manager [req-740c0679-76dc-4002-8c68-aea806955f3f req-0f3bf598-edac-459f-9477-f1201e9d97f5 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received event network-vif-unplugged-bfb32e9e-52b6-4043-b9a6-129d11fa2814 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:12:05 compute-0 nova_compute[117514]: 2025-10-08 19:12:05.503 2 DEBUG oslo_concurrency.lockutils [req-740c0679-76dc-4002-8c68-aea806955f3f req-0f3bf598-edac-459f-9477-f1201e9d97f5 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:12:05 compute-0 nova_compute[117514]: 2025-10-08 19:12:05.503 2 DEBUG oslo_concurrency.lockutils [req-740c0679-76dc-4002-8c68-aea806955f3f req-0f3bf598-edac-459f-9477-f1201e9d97f5 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:12:05 compute-0 nova_compute[117514]: 2025-10-08 19:12:05.503 2 DEBUG oslo_concurrency.lockutils [req-740c0679-76dc-4002-8c68-aea806955f3f req-0f3bf598-edac-459f-9477-f1201e9d97f5 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:12:05 compute-0 nova_compute[117514]: 2025-10-08 19:12:05.503 2 DEBUG nova.compute.manager [req-740c0679-76dc-4002-8c68-aea806955f3f req-0f3bf598-edac-459f-9477-f1201e9d97f5 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] No waiting events found dispatching network-vif-unplugged-bfb32e9e-52b6-4043-b9a6-129d11fa2814 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:12:05 compute-0 nova_compute[117514]: 2025-10-08 19:12:05.503 2 WARNING nova.compute.manager [req-740c0679-76dc-4002-8c68-aea806955f3f req-0f3bf598-edac-459f-9477-f1201e9d97f5 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received unexpected event network-vif-unplugged-bfb32e9e-52b6-4043-b9a6-129d11fa2814 for instance with vm_state deleted and task_state None.#033[00m
Oct  8 19:12:05 compute-0 nova_compute[117514]: 2025-10-08 19:12:05.504 2 DEBUG nova.compute.manager [req-740c0679-76dc-4002-8c68-aea806955f3f req-0f3bf598-edac-459f-9477-f1201e9d97f5 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received event network-vif-plugged-bfb32e9e-52b6-4043-b9a6-129d11fa2814 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:12:05 compute-0 nova_compute[117514]: 2025-10-08 19:12:05.504 2 DEBUG oslo_concurrency.lockutils [req-740c0679-76dc-4002-8c68-aea806955f3f req-0f3bf598-edac-459f-9477-f1201e9d97f5 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:12:05 compute-0 nova_compute[117514]: 2025-10-08 19:12:05.504 2 DEBUG oslo_concurrency.lockutils [req-740c0679-76dc-4002-8c68-aea806955f3f req-0f3bf598-edac-459f-9477-f1201e9d97f5 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:12:05 compute-0 nova_compute[117514]: 2025-10-08 19:12:05.504 2 DEBUG oslo_concurrency.lockutils [req-740c0679-76dc-4002-8c68-aea806955f3f req-0f3bf598-edac-459f-9477-f1201e9d97f5 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "783f8889-2bc8-4641-bdb9-95ee4226a2fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:12:05 compute-0 nova_compute[117514]: 2025-10-08 19:12:05.504 2 DEBUG nova.compute.manager [req-740c0679-76dc-4002-8c68-aea806955f3f req-0f3bf598-edac-459f-9477-f1201e9d97f5 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] No waiting events found dispatching network-vif-plugged-bfb32e9e-52b6-4043-b9a6-129d11fa2814 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:12:05 compute-0 nova_compute[117514]: 2025-10-08 19:12:05.505 2 WARNING nova.compute.manager [req-740c0679-76dc-4002-8c68-aea806955f3f req-0f3bf598-edac-459f-9477-f1201e9d97f5 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received unexpected event network-vif-plugged-bfb32e9e-52b6-4043-b9a6-129d11fa2814 for instance with vm_state deleted and task_state None.#033[00m
Oct  8 19:12:05 compute-0 nova_compute[117514]: 2025-10-08 19:12:05.758 2 DEBUG nova.compute.manager [req-e86a8bd8-fa0c-4a22-ac7c-911cbd94655a req-a99c0ac0-97a1-4fb1-8cb7-0079fc7a690d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Received event network-vif-deleted-bfb32e9e-52b6-4043-b9a6-129d11fa2814 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:12:06 compute-0 nova_compute[117514]: 2025-10-08 19:12:06.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:06 compute-0 podman[148547]: 2025-10-08 19:12:06.689340659 +0000 UTC m=+0.089018837 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Oct  8 19:12:06 compute-0 podman[148545]: 2025-10-08 19:12:06.698988248 +0000 UTC m=+0.116542363 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=iscsid)
Oct  8 19:12:06 compute-0 podman[148546]: 2025-10-08 19:12:06.738472711 +0000 UTC m=+0.144543064 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  8 19:12:07 compute-0 nova_compute[117514]: 2025-10-08 19:12:07.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:12:07 compute-0 nova_compute[117514]: 2025-10-08 19:12:07.746 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:12:07 compute-0 nova_compute[117514]: 2025-10-08 19:12:07.747 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:12:07 compute-0 nova_compute[117514]: 2025-10-08 19:12:07.747 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:12:07 compute-0 nova_compute[117514]: 2025-10-08 19:12:07.748 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 19:12:07 compute-0 nova_compute[117514]: 2025-10-08 19:12:07.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:07 compute-0 nova_compute[117514]: 2025-10-08 19:12:07.932 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:07 compute-0 nova_compute[117514]: 2025-10-08 19:12:07.990 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 19:12:07 compute-0 nova_compute[117514]: 2025-10-08 19:12:07.992 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6096MB free_disk=73.41395950317383GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 19:12:07 compute-0 nova_compute[117514]: 2025-10-08 19:12:07.992 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:12:07 compute-0 nova_compute[117514]: 2025-10-08 19:12:07.993 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:12:08 compute-0 nova_compute[117514]: 2025-10-08 19:12:08.059 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 19:12:08 compute-0 nova_compute[117514]: 2025-10-08 19:12:08.060 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 19:12:08 compute-0 nova_compute[117514]: 2025-10-08 19:12:08.084 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:12:08 compute-0 nova_compute[117514]: 2025-10-08 19:12:08.101 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:12:08 compute-0 nova_compute[117514]: 2025-10-08 19:12:08.124 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 19:12:08 compute-0 nova_compute[117514]: 2025-10-08 19:12:08.124 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:12:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:12:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:12:08 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:08.437 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f81f7a-64d8-418a-a74c-b879bd6deb83, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:12:08 compute-0 nova_compute[117514]: 2025-10-08 19:12:08.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:09 compute-0 nova_compute[117514]: 2025-10-08 19:12:09.121 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:12:09 compute-0 nova_compute[117514]: 2025-10-08 19:12:09.121 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:12:09 compute-0 nova_compute[117514]: 2025-10-08 19:12:09.122 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 19:12:09 compute-0 nova_compute[117514]: 2025-10-08 19:12:09.718 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:12:09 compute-0 nova_compute[117514]: 2025-10-08 19:12:09.718 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 19:12:09 compute-0 nova_compute[117514]: 2025-10-08 19:12:09.747 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 19:12:09 compute-0 nova_compute[117514]: 2025-10-08 19:12:09.748 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:12:09 compute-0 nova_compute[117514]: 2025-10-08 19:12:09.748 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:12:11 compute-0 nova_compute[117514]: 2025-10-08 19:12:11.169 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:12 compute-0 nova_compute[117514]: 2025-10-08 19:12:12.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:12:12 compute-0 nova_compute[117514]: 2025-10-08 19:12:12.718 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:12:13 compute-0 nova_compute[117514]: 2025-10-08 19:12:13.239 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759950718.237962, 6af51230-93a7-45ef-9a1e-c47302f43bcf => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:12:13 compute-0 nova_compute[117514]: 2025-10-08 19:12:13.240 2 INFO nova.compute.manager [-] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] VM Stopped (Lifecycle Event)#033[00m
Oct  8 19:12:13 compute-0 nova_compute[117514]: 2025-10-08 19:12:13.266 2 DEBUG nova.compute.manager [None req-6015f091-6ebe-41ee-a8e3-9bae346d4c2f - - - - - -] [instance: 6af51230-93a7-45ef-9a1e-c47302f43bcf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:12:13 compute-0 nova_compute[117514]: 2025-10-08 19:12:13.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:15 compute-0 podman[148613]: 2025-10-08 19:12:15.674344477 +0000 UTC m=+0.086946507 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 19:12:16 compute-0 nova_compute[117514]: 2025-10-08 19:12:16.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:18 compute-0 nova_compute[117514]: 2025-10-08 19:12:18.920 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759950723.9176078, 783f8889-2bc8-4641-bdb9-95ee4226a2fd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:12:18 compute-0 nova_compute[117514]: 2025-10-08 19:12:18.921 2 INFO nova.compute.manager [-] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] VM Stopped (Lifecycle Event)#033[00m
Oct  8 19:12:18 compute-0 nova_compute[117514]: 2025-10-08 19:12:18.947 2 DEBUG nova.compute.manager [None req-99b0ff47-cd86-4520-b735-4251f68d6515 - - - - - -] [instance: 783f8889-2bc8-4641-bdb9-95ee4226a2fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:12:18 compute-0 nova_compute[117514]: 2025-10-08 19:12:18.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:21 compute-0 nova_compute[117514]: 2025-10-08 19:12:21.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:23 compute-0 nova_compute[117514]: 2025-10-08 19:12:23.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:25 compute-0 podman[148637]: 2025-10-08 19:12:25.636637426 +0000 UTC m=+0.060350248 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, org.label-schema.vendor=CentOS)
Oct  8 19:12:26 compute-0 nova_compute[117514]: 2025-10-08 19:12:26.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.141 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "9064d639-63f3-422f-a67f-7a4dad8d2182" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.142 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "9064d639-63f3-422f-a67f-7a4dad8d2182" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.159 2 DEBUG nova.compute.manager [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.260 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.261 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.272 2 DEBUG nova.virt.hardware [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.273 2 INFO nova.compute.claims [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Claim successful on node compute-0.ctlplane.example.com#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.377 2 DEBUG nova.compute.provider_tree [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.400 2 DEBUG nova.scheduler.client.report [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.427 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.428 2 DEBUG nova.compute.manager [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.517 2 DEBUG nova.compute.manager [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.518 2 DEBUG nova.network.neutron [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.535 2 INFO nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.555 2 DEBUG nova.compute.manager [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.632 2 DEBUG nova.compute.manager [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.634 2 DEBUG nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.634 2 INFO nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Creating image(s)#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.635 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "/var/lib/nova/instances/9064d639-63f3-422f-a67f-7a4dad8d2182/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.635 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/9064d639-63f3-422f-a67f-7a4dad8d2182/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.636 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/9064d639-63f3-422f-a67f-7a4dad8d2182/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.649 2 DEBUG oslo_concurrency.processutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.715 2 DEBUG oslo_concurrency.processutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.716 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "008eb3078b811ee47058b7252a820910c35fc6df" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.717 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.733 2 DEBUG oslo_concurrency.processutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.793 2 DEBUG oslo_concurrency.processutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.794 2 DEBUG oslo_concurrency.processutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/9064d639-63f3-422f-a67f-7a4dad8d2182/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.843 2 DEBUG oslo_concurrency.processutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/9064d639-63f3-422f-a67f-7a4dad8d2182/disk 1073741824" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.844 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.845 2 DEBUG oslo_concurrency.processutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.897 2 DEBUG oslo_concurrency.processutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.899 2 DEBUG nova.virt.disk.api [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Checking if we can resize image /var/lib/nova/instances/9064d639-63f3-422f-a67f-7a4dad8d2182/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.899 2 DEBUG oslo_concurrency.processutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9064d639-63f3-422f-a67f-7a4dad8d2182/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.954 2 DEBUG oslo_concurrency.processutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9064d639-63f3-422f-a67f-7a4dad8d2182/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.956 2 DEBUG nova.virt.disk.api [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Cannot resize image /var/lib/nova/instances/9064d639-63f3-422f-a67f-7a4dad8d2182/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.956 2 DEBUG nova.objects.instance [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'migration_context' on Instance uuid 9064d639-63f3-422f-a67f-7a4dad8d2182 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.958 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.981 2 DEBUG nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.982 2 DEBUG nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Ensure instance console log exists: /var/lib/nova/instances/9064d639-63f3-422f-a67f-7a4dad8d2182/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.982 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.983 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:12:28 compute-0 nova_compute[117514]: 2025-10-08 19:12:28.983 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:12:29 compute-0 nova_compute[117514]: 2025-10-08 19:12:29.338 2 DEBUG nova.policy [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 19:12:31 compute-0 nova_compute[117514]: 2025-10-08 19:12:31.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:31 compute-0 nova_compute[117514]: 2025-10-08 19:12:31.321 2 DEBUG nova.network.neutron [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Successfully updated port: 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 19:12:31 compute-0 nova_compute[117514]: 2025-10-08 19:12:31.336 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "refresh_cache-9064d639-63f3-422f-a67f-7a4dad8d2182" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:12:31 compute-0 nova_compute[117514]: 2025-10-08 19:12:31.336 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquired lock "refresh_cache-9064d639-63f3-422f-a67f-7a4dad8d2182" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:12:31 compute-0 nova_compute[117514]: 2025-10-08 19:12:31.337 2 DEBUG nova.network.neutron [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 19:12:31 compute-0 nova_compute[117514]: 2025-10-08 19:12:31.404 2 DEBUG nova.compute.manager [req-cb2ce92c-f7b2-40ba-a5c9-6eed785f55e5 req-1705a742-acbb-4a77-8261-c18cc1ae9158 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Received event network-changed-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:12:31 compute-0 nova_compute[117514]: 2025-10-08 19:12:31.405 2 DEBUG nova.compute.manager [req-cb2ce92c-f7b2-40ba-a5c9-6eed785f55e5 req-1705a742-acbb-4a77-8261-c18cc1ae9158 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Refreshing instance network info cache due to event network-changed-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 19:12:31 compute-0 nova_compute[117514]: 2025-10-08 19:12:31.405 2 DEBUG oslo_concurrency.lockutils [req-cb2ce92c-f7b2-40ba-a5c9-6eed785f55e5 req-1705a742-acbb-4a77-8261-c18cc1ae9158 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-9064d639-63f3-422f-a67f-7a4dad8d2182" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:12:31 compute-0 nova_compute[117514]: 2025-10-08 19:12:31.466 2 DEBUG nova.network.neutron [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 19:12:31 compute-0 podman[148673]: 2025-10-08 19:12:31.683053862 +0000 UTC m=+0.094780874 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 19:12:31 compute-0 podman[148672]: 2025-10-08 19:12:31.688879641 +0000 UTC m=+0.099758018 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, distribution-scope=public, io.openshift.tags=minimal rhel9, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, managed_by=edpm_ansible, version=9.6, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7)
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.614 2 DEBUG nova.network.neutron [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Updating instance_info_cache with network_info: [{"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.642 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Releasing lock "refresh_cache-9064d639-63f3-422f-a67f-7a4dad8d2182" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.642 2 DEBUG nova.compute.manager [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Instance network_info: |[{"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.643 2 DEBUG oslo_concurrency.lockutils [req-cb2ce92c-f7b2-40ba-a5c9-6eed785f55e5 req-1705a742-acbb-4a77-8261-c18cc1ae9158 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-9064d639-63f3-422f-a67f-7a4dad8d2182" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.643 2 DEBUG nova.network.neutron [req-cb2ce92c-f7b2-40ba-a5c9-6eed785f55e5 req-1705a742-acbb-4a77-8261-c18cc1ae9158 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Refreshing network info cache for port 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.646 2 DEBUG nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Start _get_guest_xml network_info=[{"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'image_id': '23cfa426-7011-4566-992d-1c7af39f70dd'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.652 2 WARNING nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.659 2 DEBUG nova.virt.libvirt.host [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.660 2 DEBUG nova.virt.libvirt.host [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.663 2 DEBUG nova.virt.libvirt.host [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.664 2 DEBUG nova.virt.libvirt.host [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.664 2 DEBUG nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.664 2 DEBUG nova.virt.hardware [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T19:05:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e8a148fc-4419-4813-98ff-a17e2a95609e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.665 2 DEBUG nova.virt.hardware [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.665 2 DEBUG nova.virt.hardware [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.665 2 DEBUG nova.virt.hardware [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.666 2 DEBUG nova.virt.hardware [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.666 2 DEBUG nova.virt.hardware [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.666 2 DEBUG nova.virt.hardware [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.666 2 DEBUG nova.virt.hardware [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.667 2 DEBUG nova.virt.hardware [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.667 2 DEBUG nova.virt.hardware [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.667 2 DEBUG nova.virt.hardware [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.671 2 DEBUG nova.virt.libvirt.vif [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:12:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2063989498',display_name='tempest-TestNetworkBasicOps-server-2063989498',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2063989498',id=8,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAHrmLiAZgqvVf5RfnV+fV+6NQU3NOHOaBSGPHYj+myBWF2AbxEIt6spK8FUlXi8r+736xE5lbIw3NTujAKkT/2AVTAI40/9ASURZfXUfcM5xxB2Et9shqsazA/r0h6yOw==',key_name='tempest-TestNetworkBasicOps-2016672130',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-v8tc0ebw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:12:28Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=9064d639-63f3-422f-a67f-7a4dad8d2182,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.671 2 DEBUG nova.network.os_vif_util [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.672 2 DEBUG nova.network.os_vif_util [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:78:c5:0e,bridge_name='br-int',has_traffic_filtering=True,id=8b1cf032-8f00-4a3b-a370-211b5b0ca4ce,network=Network(9bec1de5-8be3-4df6-b90a-943d76fedc48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8b1cf032-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.673 2 DEBUG nova.objects.instance [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9064d639-63f3-422f-a67f-7a4dad8d2182 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.687 2 DEBUG nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] End _get_guest_xml xml=<domain type="kvm">
Oct  8 19:12:32 compute-0 nova_compute[117514]:  <uuid>9064d639-63f3-422f-a67f-7a4dad8d2182</uuid>
Oct  8 19:12:32 compute-0 nova_compute[117514]:  <name>instance-00000008</name>
Oct  8 19:12:32 compute-0 nova_compute[117514]:  <memory>131072</memory>
Oct  8 19:12:32 compute-0 nova_compute[117514]:  <vcpu>1</vcpu>
Oct  8 19:12:32 compute-0 nova_compute[117514]:  <metadata>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 19:12:32 compute-0 nova_compute[117514]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:      <nova:name>tempest-TestNetworkBasicOps-server-2063989498</nova:name>
Oct  8 19:12:32 compute-0 nova_compute[117514]:      <nova:creationTime>2025-10-08 19:12:32</nova:creationTime>
Oct  8 19:12:32 compute-0 nova_compute[117514]:      <nova:flavor name="m1.nano">
Oct  8 19:12:32 compute-0 nova_compute[117514]:        <nova:memory>128</nova:memory>
Oct  8 19:12:32 compute-0 nova_compute[117514]:        <nova:disk>1</nova:disk>
Oct  8 19:12:32 compute-0 nova_compute[117514]:        <nova:swap>0</nova:swap>
Oct  8 19:12:32 compute-0 nova_compute[117514]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 19:12:32 compute-0 nova_compute[117514]:        <nova:vcpus>1</nova:vcpus>
Oct  8 19:12:32 compute-0 nova_compute[117514]:      </nova:flavor>
Oct  8 19:12:32 compute-0 nova_compute[117514]:      <nova:owner>
Oct  8 19:12:32 compute-0 nova_compute[117514]:        <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct  8 19:12:32 compute-0 nova_compute[117514]:        <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct  8 19:12:32 compute-0 nova_compute[117514]:      </nova:owner>
Oct  8 19:12:32 compute-0 nova_compute[117514]:      <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:      <nova:ports>
Oct  8 19:12:32 compute-0 nova_compute[117514]:        <nova:port uuid="8b1cf032-8f00-4a3b-a370-211b5b0ca4ce">
Oct  8 19:12:32 compute-0 nova_compute[117514]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:        </nova:port>
Oct  8 19:12:32 compute-0 nova_compute[117514]:      </nova:ports>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    </nova:instance>
Oct  8 19:12:32 compute-0 nova_compute[117514]:  </metadata>
Oct  8 19:12:32 compute-0 nova_compute[117514]:  <sysinfo type="smbios">
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <system>
Oct  8 19:12:32 compute-0 nova_compute[117514]:      <entry name="manufacturer">RDO</entry>
Oct  8 19:12:32 compute-0 nova_compute[117514]:      <entry name="product">OpenStack Compute</entry>
Oct  8 19:12:32 compute-0 nova_compute[117514]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 19:12:32 compute-0 nova_compute[117514]:      <entry name="serial">9064d639-63f3-422f-a67f-7a4dad8d2182</entry>
Oct  8 19:12:32 compute-0 nova_compute[117514]:      <entry name="uuid">9064d639-63f3-422f-a67f-7a4dad8d2182</entry>
Oct  8 19:12:32 compute-0 nova_compute[117514]:      <entry name="family">Virtual Machine</entry>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    </system>
Oct  8 19:12:32 compute-0 nova_compute[117514]:  </sysinfo>
Oct  8 19:12:32 compute-0 nova_compute[117514]:  <os>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <boot dev="hd"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <smbios mode="sysinfo"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:  </os>
Oct  8 19:12:32 compute-0 nova_compute[117514]:  <features>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <acpi/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <apic/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <vmcoreinfo/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:  </features>
Oct  8 19:12:32 compute-0 nova_compute[117514]:  <clock offset="utc">
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <timer name="hpet" present="no"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:  </clock>
Oct  8 19:12:32 compute-0 nova_compute[117514]:  <cpu mode="host-model" match="exact">
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:  </cpu>
Oct  8 19:12:32 compute-0 nova_compute[117514]:  <devices>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <disk type="file" device="disk">
Oct  8 19:12:32 compute-0 nova_compute[117514]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:      <source file="/var/lib/nova/instances/9064d639-63f3-422f-a67f-7a4dad8d2182/disk"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:      <target dev="vda" bus="virtio"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <disk type="file" device="cdrom">
Oct  8 19:12:32 compute-0 nova_compute[117514]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:      <source file="/var/lib/nova/instances/9064d639-63f3-422f-a67f-7a4dad8d2182/disk.config"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:      <target dev="sda" bus="sata"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <interface type="ethernet">
Oct  8 19:12:32 compute-0 nova_compute[117514]:      <mac address="fa:16:3e:78:c5:0e"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:      <model type="virtio"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:      <mtu size="1442"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:      <target dev="tap8b1cf032-8f"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    </interface>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <serial type="pty">
Oct  8 19:12:32 compute-0 nova_compute[117514]:      <log file="/var/lib/nova/instances/9064d639-63f3-422f-a67f-7a4dad8d2182/console.log" append="off"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    </serial>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <video>
Oct  8 19:12:32 compute-0 nova_compute[117514]:      <model type="virtio"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    </video>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <input type="tablet" bus="usb"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <rng model="virtio">
Oct  8 19:12:32 compute-0 nova_compute[117514]:      <backend model="random">/dev/urandom</backend>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    </rng>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <controller type="usb" index="0"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    <memballoon model="virtio">
Oct  8 19:12:32 compute-0 nova_compute[117514]:      <stats period="10"/>
Oct  8 19:12:32 compute-0 nova_compute[117514]:    </memballoon>
Oct  8 19:12:32 compute-0 nova_compute[117514]:  </devices>
Oct  8 19:12:32 compute-0 nova_compute[117514]: </domain>
Oct  8 19:12:32 compute-0 nova_compute[117514]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.688 2 DEBUG nova.compute.manager [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Preparing to wait for external event network-vif-plugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.689 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "9064d639-63f3-422f-a67f-7a4dad8d2182-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.689 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "9064d639-63f3-422f-a67f-7a4dad8d2182-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.689 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "9064d639-63f3-422f-a67f-7a4dad8d2182-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.690 2 DEBUG nova.virt.libvirt.vif [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:12:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2063989498',display_name='tempest-TestNetworkBasicOps-server-2063989498',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2063989498',id=8,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAHrmLiAZgqvVf5RfnV+fV+6NQU3NOHOaBSGPHYj+myBWF2AbxEIt6spK8FUlXi8r+736xE5lbIw3NTujAKkT/2AVTAI40/9ASURZfXUfcM5xxB2Et9shqsazA/r0h6yOw==',key_name='tempest-TestNetworkBasicOps-2016672130',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-v8tc0ebw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:12:28Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=9064d639-63f3-422f-a67f-7a4dad8d2182,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.690 2 DEBUG nova.network.os_vif_util [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.691 2 DEBUG nova.network.os_vif_util [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:78:c5:0e,bridge_name='br-int',has_traffic_filtering=True,id=8b1cf032-8f00-4a3b-a370-211b5b0ca4ce,network=Network(9bec1de5-8be3-4df6-b90a-943d76fedc48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8b1cf032-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.691 2 DEBUG os_vif [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:c5:0e,bridge_name='br-int',has_traffic_filtering=True,id=8b1cf032-8f00-4a3b-a370-211b5b0ca4ce,network=Network(9bec1de5-8be3-4df6-b90a-943d76fedc48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8b1cf032-8f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.692 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.693 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.695 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8b1cf032-8f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.695 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8b1cf032-8f, col_values=(('external_ids', {'iface-id': '8b1cf032-8f00-4a3b-a370-211b5b0ca4ce', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:78:c5:0e', 'vm-uuid': '9064d639-63f3-422f-a67f-7a4dad8d2182'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:32 compute-0 NetworkManager[1035]: <info>  [1759950752.6979] manager: (tap8b1cf032-8f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/64)
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.709 2 INFO os_vif [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:c5:0e,bridge_name='br-int',has_traffic_filtering=True,id=8b1cf032-8f00-4a3b-a370-211b5b0ca4ce,network=Network(9bec1de5-8be3-4df6-b90a-943d76fedc48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8b1cf032-8f')#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.782 2 DEBUG nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.782 2 DEBUG nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.783 2 DEBUG nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No VIF found with MAC fa:16:3e:78:c5:0e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 19:12:32 compute-0 nova_compute[117514]: 2025-10-08 19:12:32.783 2 INFO nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Using config drive#033[00m
Oct  8 19:12:33 compute-0 nova_compute[117514]: 2025-10-08 19:12:33.300 2 INFO nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Creating config drive at /var/lib/nova/instances/9064d639-63f3-422f-a67f-7a4dad8d2182/disk.config#033[00m
Oct  8 19:12:33 compute-0 nova_compute[117514]: 2025-10-08 19:12:33.309 2 DEBUG oslo_concurrency.processutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9064d639-63f3-422f-a67f-7a4dad8d2182/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmqm54ocd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:12:33 compute-0 nova_compute[117514]: 2025-10-08 19:12:33.445 2 DEBUG oslo_concurrency.processutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9064d639-63f3-422f-a67f-7a4dad8d2182/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmqm54ocd" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:12:33 compute-0 kernel: tap8b1cf032-8f: entered promiscuous mode
Oct  8 19:12:33 compute-0 NetworkManager[1035]: <info>  [1759950753.5090] manager: (tap8b1cf032-8f): new Tun device (/org/freedesktop/NetworkManager/Devices/65)
Oct  8 19:12:33 compute-0 nova_compute[117514]: 2025-10-08 19:12:33.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:33 compute-0 ovn_controller[19759]: 2025-10-08T19:12:33Z|00109|binding|INFO|Claiming lport 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce for this chassis.
Oct  8 19:12:33 compute-0 ovn_controller[19759]: 2025-10-08T19:12:33Z|00110|binding|INFO|8b1cf032-8f00-4a3b-a370-211b5b0ca4ce: Claiming fa:16:3e:78:c5:0e 10.100.0.4
Oct  8 19:12:33 compute-0 nova_compute[117514]: 2025-10-08 19:12:33.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.532 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:c5:0e 10.100.0.4'], port_security=['fa:16:3e:78:c5:0e 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-94131643', 'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '9064d639-63f3-422f-a67f-7a4dad8d2182', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9bec1de5-8be3-4df6-b90a-943d76fedc48', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-94131643', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'be57f10c-6afc-483d-a1fa-fab953b8fe3e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1d3717a1-7175-40f6-8720-19b5f1d50c4f, chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=8b1cf032-8f00-4a3b-a370-211b5b0ca4ce) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.533 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce in datapath 9bec1de5-8be3-4df6-b90a-943d76fedc48 bound to our chassis#033[00m
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.534 28643 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9bec1de5-8be3-4df6-b90a-943d76fedc48#033[00m
Oct  8 19:12:33 compute-0 systemd-udevd[148732]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.549 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[aeb45690-ef70-41e0-b164-bbb5d3fcee3e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.550 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9bec1de5-81 in ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 19:12:33 compute-0 systemd-machined[77568]: New machine qemu-8-instance-00000008.
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.559 144726 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9bec1de5-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.559 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[63d1c861-d141-4e44-852b-3bab02309076]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.560 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[008dd50a-9519-40e7-90ad-6622bc285207]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:33 compute-0 NetworkManager[1035]: <info>  [1759950753.5684] device (tap8b1cf032-8f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 19:12:33 compute-0 NetworkManager[1035]: <info>  [1759950753.5693] device (tap8b1cf032-8f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 19:12:33 compute-0 ovn_controller[19759]: 2025-10-08T19:12:33Z|00111|binding|INFO|Setting lport 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce ovn-installed in OVS
Oct  8 19:12:33 compute-0 ovn_controller[19759]: 2025-10-08T19:12:33Z|00112|binding|INFO|Setting lport 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce up in Southbound
Oct  8 19:12:33 compute-0 systemd[1]: Started Virtual Machine qemu-8-instance-00000008.
Oct  8 19:12:33 compute-0 nova_compute[117514]: 2025-10-08 19:12:33.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.576 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[e7e6ae86-d9fc-4a23-9146-06b03f2bf117]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.594 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[ef14f539-b71d-4e95-b0a4-e5cd732e1311]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.630 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[c0bdc9b1-5747-4cf2-ae9c-b7cbc8e658eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.638 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[dacd09f6-344b-44c9-b220-d876cedff3d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:33 compute-0 NetworkManager[1035]: <info>  [1759950753.6392] manager: (tap9bec1de5-80): new Veth device (/org/freedesktop/NetworkManager/Devices/66)
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.679 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[e1b2cfef-e96d-4400-892d-927616bd7a4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.684 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[28b28539-a78a-4cb4-a0f6-400f6d2e35ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:33 compute-0 NetworkManager[1035]: <info>  [1759950753.7151] device (tap9bec1de5-80): carrier: link connected
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.722 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[fe335248-3986-43e5-a4da-00acbea1de26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.746 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[9579fe01-10ae-4f6d-a543-4122cfae2b16]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9bec1de5-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e5:ad:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 142429, 'reachable_time': 23969, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 148766, 'error': None, 'target': 'ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.766 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[7cd07b92-8e2d-450b-95f2-08ed663d3969]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee5:ad60'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 142429, 'tstamp': 142429}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 148767, 'error': None, 'target': 'ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.789 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[52677218-a51f-42dc-81ae-dac36394f6a7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9bec1de5-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e5:ad:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 142429, 'reachable_time': 23969, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 148768, 'error': None, 'target': 'ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:33 compute-0 nova_compute[117514]: 2025-10-08 19:12:33.796 2 DEBUG nova.compute.manager [req-0ce651f1-aed1-4de0-9c42-7bc1cb315a1e req-56478a7b-2765-4d72-841d-79e1ef4148ac bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Received event network-vif-plugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:12:33 compute-0 nova_compute[117514]: 2025-10-08 19:12:33.796 2 DEBUG oslo_concurrency.lockutils [req-0ce651f1-aed1-4de0-9c42-7bc1cb315a1e req-56478a7b-2765-4d72-841d-79e1ef4148ac bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "9064d639-63f3-422f-a67f-7a4dad8d2182-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:12:33 compute-0 nova_compute[117514]: 2025-10-08 19:12:33.797 2 DEBUG oslo_concurrency.lockutils [req-0ce651f1-aed1-4de0-9c42-7bc1cb315a1e req-56478a7b-2765-4d72-841d-79e1ef4148ac bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "9064d639-63f3-422f-a67f-7a4dad8d2182-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:12:33 compute-0 nova_compute[117514]: 2025-10-08 19:12:33.797 2 DEBUG oslo_concurrency.lockutils [req-0ce651f1-aed1-4de0-9c42-7bc1cb315a1e req-56478a7b-2765-4d72-841d-79e1ef4148ac bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "9064d639-63f3-422f-a67f-7a4dad8d2182-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:12:33 compute-0 nova_compute[117514]: 2025-10-08 19:12:33.797 2 DEBUG nova.compute.manager [req-0ce651f1-aed1-4de0-9c42-7bc1cb315a1e req-56478a7b-2765-4d72-841d-79e1ef4148ac bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Processing event network-vif-plugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.841 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[12caad1a-e216-4e4f-ad0a-25bfdf2e3331]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.931 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[6c9e2602-907e-4673-bb62-a2ec470764ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.933 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9bec1de5-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.933 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.934 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9bec1de5-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:12:33 compute-0 kernel: tap9bec1de5-80: entered promiscuous mode
Oct  8 19:12:33 compute-0 nova_compute[117514]: 2025-10-08 19:12:33.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:33 compute-0 NetworkManager[1035]: <info>  [1759950753.9367] manager: (tap9bec1de5-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.942 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9bec1de5-80, col_values=(('external_ids', {'iface-id': 'b57e5c57-68fb-43de-8f87-98a853dc8be7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:12:33 compute-0 ovn_controller[19759]: 2025-10-08T19:12:33Z|00113|binding|INFO|Releasing lport b57e5c57-68fb-43de-8f87-98a853dc8be7 from this chassis (sb_readonly=0)
Oct  8 19:12:33 compute-0 nova_compute[117514]: 2025-10-08 19:12:33.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.944 28643 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9bec1de5-8be3-4df6-b90a-943d76fedc48.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9bec1de5-8be3-4df6-b90a-943d76fedc48.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.945 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[61394c4e-c729-4daf-8f54-2c1c4d8d8f39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.946 28643 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]: global
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]:    log         /dev/log local0 debug
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]:    log-tag     haproxy-metadata-proxy-9bec1de5-8be3-4df6-b90a-943d76fedc48
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]:    user        root
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]:    group       root
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]:    maxconn     1024
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]:    pidfile     /var/lib/neutron/external/pids/9bec1de5-8be3-4df6-b90a-943d76fedc48.pid.haproxy
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]:    daemon
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]: 
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]: defaults
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]:    log global
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]:    mode http
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]:    option httplog
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]:    option dontlognull
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]:    option http-server-close
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]:    option forwardfor
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]:    retries                 3
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]:    timeout http-request    30s
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]:    timeout connect         30s
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]:    timeout client          32s
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]:    timeout server          32s
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]:    timeout http-keep-alive 30s
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]: 
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]: 
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]: listen listener
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]:    bind 169.254.169.254:80
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]:    http-request add-header X-OVN-Network-ID 9bec1de5-8be3-4df6-b90a-943d76fedc48
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 19:12:33 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:33.947 28643 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48', 'env', 'PROCESS_TAG=haproxy-9bec1de5-8be3-4df6-b90a-943d76fedc48', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9bec1de5-8be3-4df6-b90a-943d76fedc48.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 19:12:33 compute-0 nova_compute[117514]: 2025-10-08 19:12:33.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.282 2 DEBUG nova.network.neutron [req-cb2ce92c-f7b2-40ba-a5c9-6eed785f55e5 req-1705a742-acbb-4a77-8261-c18cc1ae9158 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Updated VIF entry in instance network info cache for port 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.282 2 DEBUG nova.network.neutron [req-cb2ce92c-f7b2-40ba-a5c9-6eed785f55e5 req-1705a742-acbb-4a77-8261-c18cc1ae9158 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Updating instance_info_cache with network_info: [{"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:12:34 compute-0 podman[148800]: 2025-10-08 19:12:34.292727842 +0000 UTC m=+0.056915489 container create ba509af1b1a2aacf4afdbcc1de97c2dadc543be61aff900e5b38b6bdc5419987 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.296 2 DEBUG oslo_concurrency.lockutils [req-cb2ce92c-f7b2-40ba-a5c9-6eed785f55e5 req-1705a742-acbb-4a77-8261-c18cc1ae9158 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-9064d639-63f3-422f-a67f-7a4dad8d2182" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:12:34 compute-0 systemd[1]: Started libpod-conmon-ba509af1b1a2aacf4afdbcc1de97c2dadc543be61aff900e5b38b6bdc5419987.scope.
Oct  8 19:12:34 compute-0 systemd[1]: Started libcrun container.
Oct  8 19:12:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fbecf7fadeecdbbbc2fa0c43982a21d5c67a25c8cf877db7615c2f2cd48a1b6c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 19:12:34 compute-0 podman[148800]: 2025-10-08 19:12:34.265274317 +0000 UTC m=+0.029462014 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct  8 19:12:34 compute-0 podman[148800]: 2025-10-08 19:12:34.371208713 +0000 UTC m=+0.135396400 container init ba509af1b1a2aacf4afdbcc1de97c2dadc543be61aff900e5b38b6bdc5419987 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 19:12:34 compute-0 podman[148800]: 2025-10-08 19:12:34.37802458 +0000 UTC m=+0.142212277 container start ba509af1b1a2aacf4afdbcc1de97c2dadc543be61aff900e5b38b6bdc5419987 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 19:12:34 compute-0 podman[148813]: 2025-10-08 19:12:34.396077012 +0000 UTC m=+0.059239965 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 19:12:34 compute-0 neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48[148818]: [NOTICE]   (148842) : New worker (148852) forked
Oct  8 19:12:34 compute-0 neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48[148818]: [NOTICE]   (148842) : Loading success.
Oct  8 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.844 2 DEBUG nova.compute.manager [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.846 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950754.8441575, 9064d639-63f3-422f-a67f-7a4dad8d2182 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.847 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] VM Started (Lifecycle Event)#033[00m
Oct  8 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.854 2 DEBUG nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.858 2 INFO nova.virt.libvirt.driver [-] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Instance spawned successfully.#033[00m
Oct  8 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.859 2 DEBUG nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.870 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.874 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.885 2 DEBUG nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.885 2 DEBUG nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.886 2 DEBUG nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.886 2 DEBUG nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.887 2 DEBUG nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.887 2 DEBUG nova.virt.libvirt.driver [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.895 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.895 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950754.844457, 9064d639-63f3-422f-a67f-7a4dad8d2182 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.896 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] VM Paused (Lifecycle Event)#033[00m
Oct  8 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.945 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.950 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950754.8507743, 9064d639-63f3-422f-a67f-7a4dad8d2182 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.950 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] VM Resumed (Lifecycle Event)#033[00m
Oct  8 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.978 2 INFO nova.compute.manager [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Took 6.34 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.978 2 DEBUG nova.compute.manager [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.980 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:12:34 compute-0 nova_compute[117514]: 2025-10-08 19:12:34.987 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 19:12:35 compute-0 nova_compute[117514]: 2025-10-08 19:12:35.035 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 19:12:35 compute-0 nova_compute[117514]: 2025-10-08 19:12:35.062 2 INFO nova.compute.manager [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Took 6.84 seconds to build instance.#033[00m
Oct  8 19:12:35 compute-0 nova_compute[117514]: 2025-10-08 19:12:35.079 2 DEBUG oslo_concurrency.lockutils [None req-ccf31d3f-58e0-4524-a03a-00f8e4726d36 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "9064d639-63f3-422f-a67f-7a4dad8d2182" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.937s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:12:35 compute-0 nova_compute[117514]: 2025-10-08 19:12:35.891 2 DEBUG nova.compute.manager [req-ca939b36-bf91-4426-b3ea-e6a993bfc236 req-70b82259-141b-487a-947f-76dde449f332 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Received event network-vif-plugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:12:35 compute-0 nova_compute[117514]: 2025-10-08 19:12:35.892 2 DEBUG oslo_concurrency.lockutils [req-ca939b36-bf91-4426-b3ea-e6a993bfc236 req-70b82259-141b-487a-947f-76dde449f332 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "9064d639-63f3-422f-a67f-7a4dad8d2182-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:12:35 compute-0 nova_compute[117514]: 2025-10-08 19:12:35.892 2 DEBUG oslo_concurrency.lockutils [req-ca939b36-bf91-4426-b3ea-e6a993bfc236 req-70b82259-141b-487a-947f-76dde449f332 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "9064d639-63f3-422f-a67f-7a4dad8d2182-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:12:35 compute-0 nova_compute[117514]: 2025-10-08 19:12:35.892 2 DEBUG oslo_concurrency.lockutils [req-ca939b36-bf91-4426-b3ea-e6a993bfc236 req-70b82259-141b-487a-947f-76dde449f332 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "9064d639-63f3-422f-a67f-7a4dad8d2182-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:12:35 compute-0 nova_compute[117514]: 2025-10-08 19:12:35.893 2 DEBUG nova.compute.manager [req-ca939b36-bf91-4426-b3ea-e6a993bfc236 req-70b82259-141b-487a-947f-76dde449f332 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] No waiting events found dispatching network-vif-plugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:12:35 compute-0 nova_compute[117514]: 2025-10-08 19:12:35.893 2 WARNING nova.compute.manager [req-ca939b36-bf91-4426-b3ea-e6a993bfc236 req-70b82259-141b-487a-947f-76dde449f332 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Received unexpected event network-vif-plugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce for instance with vm_state active and task_state None.#033[00m
Oct  8 19:12:36 compute-0 nova_compute[117514]: 2025-10-08 19:12:36.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:37 compute-0 podman[148862]: 2025-10-08 19:12:37.647807593 +0000 UTC m=+0.063178739 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 19:12:37 compute-0 podman[148863]: 2025-10-08 19:12:37.6715654 +0000 UTC m=+0.085231017 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Oct  8 19:12:37 compute-0 podman[148864]: 2025-10-08 19:12:37.680287763 +0000 UTC m=+0.084911728 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 19:12:37 compute-0 nova_compute[117514]: 2025-10-08 19:12:37.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:38 compute-0 ovn_controller[19759]: 2025-10-08T19:12:38Z|00114|binding|INFO|Releasing lport b57e5c57-68fb-43de-8f87-98a853dc8be7 from this chassis (sb_readonly=0)
Oct  8 19:12:38 compute-0 nova_compute[117514]: 2025-10-08 19:12:38.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:38 compute-0 NetworkManager[1035]: <info>  [1759950758.4938] manager: (patch-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Oct  8 19:12:38 compute-0 NetworkManager[1035]: <info>  [1759950758.4950] manager: (patch-br-int-to-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/69)
Oct  8 19:12:38 compute-0 ovn_controller[19759]: 2025-10-08T19:12:38Z|00115|binding|INFO|Releasing lport b57e5c57-68fb-43de-8f87-98a853dc8be7 from this chassis (sb_readonly=0)
Oct  8 19:12:38 compute-0 nova_compute[117514]: 2025-10-08 19:12:38.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:38 compute-0 nova_compute[117514]: 2025-10-08 19:12:38.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:38 compute-0 nova_compute[117514]: 2025-10-08 19:12:38.766 2 DEBUG nova.compute.manager [req-e555e061-1370-4fc8-8d40-990836091950 req-af444cc9-8c48-4315-b507-317241b7a6da bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Received event network-changed-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:12:38 compute-0 nova_compute[117514]: 2025-10-08 19:12:38.767 2 DEBUG nova.compute.manager [req-e555e061-1370-4fc8-8d40-990836091950 req-af444cc9-8c48-4315-b507-317241b7a6da bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Refreshing instance network info cache due to event network-changed-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 19:12:38 compute-0 nova_compute[117514]: 2025-10-08 19:12:38.768 2 DEBUG oslo_concurrency.lockutils [req-e555e061-1370-4fc8-8d40-990836091950 req-af444cc9-8c48-4315-b507-317241b7a6da bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-9064d639-63f3-422f-a67f-7a4dad8d2182" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:12:38 compute-0 nova_compute[117514]: 2025-10-08 19:12:38.768 2 DEBUG oslo_concurrency.lockutils [req-e555e061-1370-4fc8-8d40-990836091950 req-af444cc9-8c48-4315-b507-317241b7a6da bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-9064d639-63f3-422f-a67f-7a4dad8d2182" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:12:38 compute-0 nova_compute[117514]: 2025-10-08 19:12:38.768 2 DEBUG nova.network.neutron [req-e555e061-1370-4fc8-8d40-990836091950 req-af444cc9-8c48-4315-b507-317241b7a6da bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Refreshing network info cache for port 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.300 2 DEBUG oslo_concurrency.lockutils [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "9064d639-63f3-422f-a67f-7a4dad8d2182" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.301 2 DEBUG oslo_concurrency.lockutils [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "9064d639-63f3-422f-a67f-7a4dad8d2182" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.302 2 DEBUG oslo_concurrency.lockutils [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "9064d639-63f3-422f-a67f-7a4dad8d2182-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.302 2 DEBUG oslo_concurrency.lockutils [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "9064d639-63f3-422f-a67f-7a4dad8d2182-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.302 2 DEBUG oslo_concurrency.lockutils [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "9064d639-63f3-422f-a67f-7a4dad8d2182-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.304 2 INFO nova.compute.manager [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Terminating instance#033[00m
Oct  8 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.305 2 DEBUG nova.compute.manager [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 19:12:39 compute-0 kernel: tap8b1cf032-8f (unregistering): left promiscuous mode
Oct  8 19:12:39 compute-0 NetworkManager[1035]: <info>  [1759950759.3295] device (tap8b1cf032-8f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:39 compute-0 ovn_controller[19759]: 2025-10-08T19:12:39Z|00116|binding|INFO|Releasing lport 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce from this chassis (sb_readonly=0)
Oct  8 19:12:39 compute-0 ovn_controller[19759]: 2025-10-08T19:12:39Z|00117|binding|INFO|Setting lport 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce down in Southbound
Oct  8 19:12:39 compute-0 ovn_controller[19759]: 2025-10-08T19:12:39Z|00118|binding|INFO|Removing iface tap8b1cf032-8f ovn-installed in OVS
Oct  8 19:12:39 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:39.393 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:c5:0e 10.100.0.4'], port_security=['fa:16:3e:78:c5:0e 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-94131643', 'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '9064d639-63f3-422f-a67f-7a4dad8d2182', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9bec1de5-8be3-4df6-b90a-943d76fedc48', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-94131643', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'be57f10c-6afc-483d-a1fa-fab953b8fe3e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.175'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1d3717a1-7175-40f6-8720-19b5f1d50c4f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=8b1cf032-8f00-4a3b-a370-211b5b0ca4ce) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:12:39 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:39.394 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce in datapath 9bec1de5-8be3-4df6-b90a-943d76fedc48 unbound from our chassis#033[00m
Oct  8 19:12:39 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:39.395 28643 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9bec1de5-8be3-4df6-b90a-943d76fedc48, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 19:12:39 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:39.396 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[0d8a0b3e-8156-456f-b15f-930cc05fda6f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:39 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:39.396 28643 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48 namespace which is not needed anymore#033[00m
Oct  8 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:39 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000008.scope: Deactivated successfully.
Oct  8 19:12:39 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000008.scope: Consumed 5.730s CPU time.
Oct  8 19:12:39 compute-0 systemd-machined[77568]: Machine qemu-8-instance-00000008 terminated.
Oct  8 19:12:39 compute-0 neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48[148818]: [NOTICE]   (148842) : haproxy version is 2.8.14-c23fe91
Oct  8 19:12:39 compute-0 neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48[148818]: [NOTICE]   (148842) : path to executable is /usr/sbin/haproxy
Oct  8 19:12:39 compute-0 neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48[148818]: [WARNING]  (148842) : Exiting Master process...
Oct  8 19:12:39 compute-0 neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48[148818]: [ALERT]    (148842) : Current worker (148852) exited with code 143 (Terminated)
Oct  8 19:12:39 compute-0 neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48[148818]: [WARNING]  (148842) : All workers exited. Exiting... (0)
Oct  8 19:12:39 compute-0 systemd[1]: libpod-ba509af1b1a2aacf4afdbcc1de97c2dadc543be61aff900e5b38b6bdc5419987.scope: Deactivated successfully.
Oct  8 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:39 compute-0 podman[148949]: 2025-10-08 19:12:39.539423055 +0000 UTC m=+0.052623744 container died ba509af1b1a2aacf4afdbcc1de97c2dadc543be61aff900e5b38b6bdc5419987 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 19:12:39 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ba509af1b1a2aacf4afdbcc1de97c2dadc543be61aff900e5b38b6bdc5419987-userdata-shm.mount: Deactivated successfully.
Oct  8 19:12:39 compute-0 systemd[1]: var-lib-containers-storage-overlay-fbecf7fadeecdbbbc2fa0c43982a21d5c67a25c8cf877db7615c2f2cd48a1b6c-merged.mount: Deactivated successfully.
Oct  8 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.570 2 INFO nova.virt.libvirt.driver [-] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Instance destroyed successfully.#033[00m
Oct  8 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.571 2 DEBUG nova.objects.instance [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'resources' on Instance uuid 9064d639-63f3-422f-a67f-7a4dad8d2182 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.586 2 DEBUG nova.virt.libvirt.vif [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:12:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2063989498',display_name='tempest-TestNetworkBasicOps-server-2063989498',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2063989498',id=8,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAHrmLiAZgqvVf5RfnV+fV+6NQU3NOHOaBSGPHYj+myBWF2AbxEIt6spK8FUlXi8r+736xE5lbIw3NTujAKkT/2AVTAI40/9ASURZfXUfcM5xxB2Et9shqsazA/r0h6yOw==',key_name='tempest-TestNetworkBasicOps-2016672130',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:12:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-v8tc0ebw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:12:35Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=9064d639-63f3-422f-a67f-7a4dad8d2182,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.586 2 DEBUG nova.network.os_vif_util [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.587 2 DEBUG nova.network.os_vif_util [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:78:c5:0e,bridge_name='br-int',has_traffic_filtering=True,id=8b1cf032-8f00-4a3b-a370-211b5b0ca4ce,network=Network(9bec1de5-8be3-4df6-b90a-943d76fedc48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8b1cf032-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:12:39 compute-0 podman[148949]: 2025-10-08 19:12:39.588075793 +0000 UTC m=+0.101276482 container cleanup ba509af1b1a2aacf4afdbcc1de97c2dadc543be61aff900e5b38b6bdc5419987 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.588 2 DEBUG os_vif [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:c5:0e,bridge_name='br-int',has_traffic_filtering=True,id=8b1cf032-8f00-4a3b-a370-211b5b0ca4ce,network=Network(9bec1de5-8be3-4df6-b90a-943d76fedc48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8b1cf032-8f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.591 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8b1cf032-8f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.594 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:39 compute-0 systemd[1]: libpod-conmon-ba509af1b1a2aacf4afdbcc1de97c2dadc543be61aff900e5b38b6bdc5419987.scope: Deactivated successfully.
Oct  8 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.600 2 INFO os_vif [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:c5:0e,bridge_name='br-int',has_traffic_filtering=True,id=8b1cf032-8f00-4a3b-a370-211b5b0ca4ce,network=Network(9bec1de5-8be3-4df6-b90a-943d76fedc48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8b1cf032-8f')#033[00m
Oct  8 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.601 2 INFO nova.virt.libvirt.driver [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Deleting instance files /var/lib/nova/instances/9064d639-63f3-422f-a67f-7a4dad8d2182_del#033[00m
Oct  8 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.601 2 INFO nova.virt.libvirt.driver [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Deletion of /var/lib/nova/instances/9064d639-63f3-422f-a67f-7a4dad8d2182_del complete#033[00m
Oct  8 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.644 2 INFO nova.compute.manager [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.644 2 DEBUG oslo.service.loopingcall [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.645 2 DEBUG nova.compute.manager [-] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.645 2 DEBUG nova.network.neutron [-] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 19:12:39 compute-0 podman[148995]: 2025-10-08 19:12:39.661173768 +0000 UTC m=+0.049059620 container remove ba509af1b1a2aacf4afdbcc1de97c2dadc543be61aff900e5b38b6bdc5419987 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  8 19:12:39 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:39.666 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[ce18e400-42f9-4b06-9ba0-13c0b339c46b]: (4, ('Wed Oct  8 07:12:39 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48 (ba509af1b1a2aacf4afdbcc1de97c2dadc543be61aff900e5b38b6bdc5419987)\nba509af1b1a2aacf4afdbcc1de97c2dadc543be61aff900e5b38b6bdc5419987\nWed Oct  8 07:12:39 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48 (ba509af1b1a2aacf4afdbcc1de97c2dadc543be61aff900e5b38b6bdc5419987)\nba509af1b1a2aacf4afdbcc1de97c2dadc543be61aff900e5b38b6bdc5419987\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:39 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:39.668 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[9422a892-d616-4c92-8bec-fdc24d645473]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:39 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:39.669 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9bec1de5-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:39 compute-0 kernel: tap9bec1de5-80: left promiscuous mode
Oct  8 19:12:39 compute-0 nova_compute[117514]: 2025-10-08 19:12:39.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:39 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:39.687 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[879ca788-7422-4c25-be1d-8c78205272e0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:39 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:39.714 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[5ba0e8d3-3dca-4473-9a50-8e2b9a8ee08e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:39 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:39.716 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[be756035-e6cb-48c5-adff-fa6948d2ad0e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:39 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:39.733 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[e0e9d498-e4f9-4d4d-a8a9-cd8d47add5b6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 142420, 'reachable_time': 41214, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 149008, 'error': None, 'target': 'ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:39 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:39.736 28783 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 19:12:39 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:39.736 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[46ab0e75-87fa-46d8-9aee-b7d3674d2e04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:12:39 compute-0 systemd[1]: run-netns-ovnmeta\x2d9bec1de5\x2d8be3\x2d4df6\x2db90a\x2d943d76fedc48.mount: Deactivated successfully.
Oct  8 19:12:40 compute-0 nova_compute[117514]: 2025-10-08 19:12:40.516 2 DEBUG nova.network.neutron [req-e555e061-1370-4fc8-8d40-990836091950 req-af444cc9-8c48-4315-b507-317241b7a6da bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Updated VIF entry in instance network info cache for port 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 19:12:40 compute-0 nova_compute[117514]: 2025-10-08 19:12:40.517 2 DEBUG nova.network.neutron [req-e555e061-1370-4fc8-8d40-990836091950 req-af444cc9-8c48-4315-b507-317241b7a6da bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Updating instance_info_cache with network_info: [{"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:12:40 compute-0 nova_compute[117514]: 2025-10-08 19:12:40.535 2 DEBUG oslo_concurrency.lockutils [req-e555e061-1370-4fc8-8d40-990836091950 req-af444cc9-8c48-4315-b507-317241b7a6da bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-9064d639-63f3-422f-a67f-7a4dad8d2182" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:12:40 compute-0 nova_compute[117514]: 2025-10-08 19:12:40.863 2 DEBUG nova.compute.manager [req-c3545de3-01b2-499f-b8ec-abcc85d9d8f8 req-cc76a44f-b701-4ddb-8780-7233c49c4a62 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Received event network-vif-unplugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:12:40 compute-0 nova_compute[117514]: 2025-10-08 19:12:40.864 2 DEBUG oslo_concurrency.lockutils [req-c3545de3-01b2-499f-b8ec-abcc85d9d8f8 req-cc76a44f-b701-4ddb-8780-7233c49c4a62 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "9064d639-63f3-422f-a67f-7a4dad8d2182-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:12:40 compute-0 nova_compute[117514]: 2025-10-08 19:12:40.864 2 DEBUG oslo_concurrency.lockutils [req-c3545de3-01b2-499f-b8ec-abcc85d9d8f8 req-cc76a44f-b701-4ddb-8780-7233c49c4a62 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "9064d639-63f3-422f-a67f-7a4dad8d2182-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:12:40 compute-0 nova_compute[117514]: 2025-10-08 19:12:40.864 2 DEBUG oslo_concurrency.lockutils [req-c3545de3-01b2-499f-b8ec-abcc85d9d8f8 req-cc76a44f-b701-4ddb-8780-7233c49c4a62 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "9064d639-63f3-422f-a67f-7a4dad8d2182-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:12:40 compute-0 nova_compute[117514]: 2025-10-08 19:12:40.864 2 DEBUG nova.compute.manager [req-c3545de3-01b2-499f-b8ec-abcc85d9d8f8 req-cc76a44f-b701-4ddb-8780-7233c49c4a62 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] No waiting events found dispatching network-vif-unplugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:12:40 compute-0 nova_compute[117514]: 2025-10-08 19:12:40.865 2 DEBUG nova.compute.manager [req-c3545de3-01b2-499f-b8ec-abcc85d9d8f8 req-cc76a44f-b701-4ddb-8780-7233c49c4a62 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Received event network-vif-unplugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 19:12:40 compute-0 nova_compute[117514]: 2025-10-08 19:12:40.865 2 DEBUG nova.compute.manager [req-c3545de3-01b2-499f-b8ec-abcc85d9d8f8 req-cc76a44f-b701-4ddb-8780-7233c49c4a62 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Received event network-vif-plugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:12:40 compute-0 nova_compute[117514]: 2025-10-08 19:12:40.865 2 DEBUG oslo_concurrency.lockutils [req-c3545de3-01b2-499f-b8ec-abcc85d9d8f8 req-cc76a44f-b701-4ddb-8780-7233c49c4a62 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "9064d639-63f3-422f-a67f-7a4dad8d2182-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:12:40 compute-0 nova_compute[117514]: 2025-10-08 19:12:40.865 2 DEBUG oslo_concurrency.lockutils [req-c3545de3-01b2-499f-b8ec-abcc85d9d8f8 req-cc76a44f-b701-4ddb-8780-7233c49c4a62 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "9064d639-63f3-422f-a67f-7a4dad8d2182-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:12:40 compute-0 nova_compute[117514]: 2025-10-08 19:12:40.866 2 DEBUG oslo_concurrency.lockutils [req-c3545de3-01b2-499f-b8ec-abcc85d9d8f8 req-cc76a44f-b701-4ddb-8780-7233c49c4a62 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "9064d639-63f3-422f-a67f-7a4dad8d2182-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:12:40 compute-0 nova_compute[117514]: 2025-10-08 19:12:40.866 2 DEBUG nova.compute.manager [req-c3545de3-01b2-499f-b8ec-abcc85d9d8f8 req-cc76a44f-b701-4ddb-8780-7233c49c4a62 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] No waiting events found dispatching network-vif-plugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:12:40 compute-0 nova_compute[117514]: 2025-10-08 19:12:40.866 2 WARNING nova.compute.manager [req-c3545de3-01b2-499f-b8ec-abcc85d9d8f8 req-cc76a44f-b701-4ddb-8780-7233c49c4a62 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Received unexpected event network-vif-plugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce for instance with vm_state active and task_state deleting.#033[00m
Oct  8 19:12:41 compute-0 nova_compute[117514]: 2025-10-08 19:12:41.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:41 compute-0 nova_compute[117514]: 2025-10-08 19:12:41.242 2 DEBUG nova.network.neutron [-] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:12:41 compute-0 nova_compute[117514]: 2025-10-08 19:12:41.263 2 INFO nova.compute.manager [-] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Took 1.62 seconds to deallocate network for instance.#033[00m
Oct  8 19:12:41 compute-0 nova_compute[117514]: 2025-10-08 19:12:41.307 2 DEBUG oslo_concurrency.lockutils [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:12:41 compute-0 nova_compute[117514]: 2025-10-08 19:12:41.308 2 DEBUG oslo_concurrency.lockutils [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:12:41 compute-0 nova_compute[117514]: 2025-10-08 19:12:41.379 2 DEBUG nova.compute.provider_tree [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:12:41 compute-0 nova_compute[117514]: 2025-10-08 19:12:41.393 2 DEBUG nova.scheduler.client.report [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:12:41 compute-0 nova_compute[117514]: 2025-10-08 19:12:41.412 2 DEBUG oslo_concurrency.lockutils [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:12:41 compute-0 nova_compute[117514]: 2025-10-08 19:12:41.436 2 INFO nova.scheduler.client.report [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Deleted allocations for instance 9064d639-63f3-422f-a67f-7a4dad8d2182#033[00m
Oct  8 19:12:41 compute-0 nova_compute[117514]: 2025-10-08 19:12:41.498 2 DEBUG oslo_concurrency.lockutils [None req-60f80fbc-66cb-4655-b02c-26b1c76aa351 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "9064d639-63f3-422f-a67f-7a4dad8d2182" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.197s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:12:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:44.233 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:12:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:44.234 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:12:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:12:44.234 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:12:44 compute-0 nova_compute[117514]: 2025-10-08 19:12:44.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:46 compute-0 nova_compute[117514]: 2025-10-08 19:12:46.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:46 compute-0 podman[149010]: 2025-10-08 19:12:46.642692007 +0000 UTC m=+0.063698585 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 19:12:49 compute-0 nova_compute[117514]: 2025-10-08 19:12:49.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:51 compute-0 nova_compute[117514]: 2025-10-08 19:12:51.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:53 compute-0 nova_compute[117514]: 2025-10-08 19:12:53.848 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "093d721c-61cb-4fd3-b678-7465d8840cc6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:12:53 compute-0 nova_compute[117514]: 2025-10-08 19:12:53.848 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "093d721c-61cb-4fd3-b678-7465d8840cc6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:12:53 compute-0 nova_compute[117514]: 2025-10-08 19:12:53.870 2 DEBUG nova.compute.manager [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 19:12:53 compute-0 nova_compute[117514]: 2025-10-08 19:12:53.943 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:12:53 compute-0 nova_compute[117514]: 2025-10-08 19:12:53.943 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:12:53 compute-0 nova_compute[117514]: 2025-10-08 19:12:53.954 2 DEBUG nova.virt.hardware [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 19:12:53 compute-0 nova_compute[117514]: 2025-10-08 19:12:53.954 2 INFO nova.compute.claims [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Claim successful on node compute-0.ctlplane.example.com#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.065 2 DEBUG nova.compute.provider_tree [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.079 2 DEBUG nova.scheduler.client.report [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.100 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.101 2 DEBUG nova.compute.manager [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.150 2 DEBUG nova.compute.manager [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.151 2 DEBUG nova.network.neutron [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.174 2 INFO nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.194 2 DEBUG nova.compute.manager [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.282 2 DEBUG nova.compute.manager [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.283 2 DEBUG nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.284 2 INFO nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Creating image(s)#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.284 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "/var/lib/nova/instances/093d721c-61cb-4fd3-b678-7465d8840cc6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.285 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/093d721c-61cb-4fd3-b678-7465d8840cc6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.285 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/093d721c-61cb-4fd3-b678-7465d8840cc6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.296 2 DEBUG oslo_concurrency.processutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.388 2 DEBUG oslo_concurrency.processutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.390 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "008eb3078b811ee47058b7252a820910c35fc6df" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.391 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.413 2 DEBUG oslo_concurrency.processutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.492 2 DEBUG oslo_concurrency.processutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.493 2 DEBUG oslo_concurrency.processutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/093d721c-61cb-4fd3-b678-7465d8840cc6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.524 2 DEBUG oslo_concurrency.processutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/093d721c-61cb-4fd3-b678-7465d8840cc6/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.526 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.527 2 DEBUG oslo_concurrency.processutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.568 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759950759.5666447, 9064d639-63f3-422f-a67f-7a4dad8d2182 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.569 2 INFO nova.compute.manager [-] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] VM Stopped (Lifecycle Event)#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.576 2 DEBUG oslo_concurrency.processutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.576 2 DEBUG nova.virt.disk.api [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Checking if we can resize image /var/lib/nova/instances/093d721c-61cb-4fd3-b678-7465d8840cc6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.577 2 DEBUG oslo_concurrency.processutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/093d721c-61cb-4fd3-b678-7465d8840cc6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.595 2 DEBUG nova.compute.manager [None req-487306c8-8730-4489-b337-0fdab4299ba4 - - - - - -] [instance: 9064d639-63f3-422f-a67f-7a4dad8d2182] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.626 2 DEBUG oslo_concurrency.processutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/093d721c-61cb-4fd3-b678-7465d8840cc6/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.628 2 DEBUG nova.virt.disk.api [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Cannot resize image /var/lib/nova/instances/093d721c-61cb-4fd3-b678-7465d8840cc6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.628 2 DEBUG nova.objects.instance [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'migration_context' on Instance uuid 093d721c-61cb-4fd3-b678-7465d8840cc6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.656 2 DEBUG nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.657 2 DEBUG nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Ensure instance console log exists: /var/lib/nova/instances/093d721c-61cb-4fd3-b678-7465d8840cc6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.657 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.658 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:12:54 compute-0 nova_compute[117514]: 2025-10-08 19:12:54.659 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:12:55 compute-0 nova_compute[117514]: 2025-10-08 19:12:55.340 2 DEBUG nova.policy [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 19:12:56 compute-0 nova_compute[117514]: 2025-10-08 19:12:56.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:56 compute-0 podman[149049]: 2025-10-08 19:12:56.652639524 +0000 UTC m=+0.072720309 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001)
Oct  8 19:12:57 compute-0 nova_compute[117514]: 2025-10-08 19:12:57.295 2 DEBUG nova.network.neutron [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Successfully updated port: 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 19:12:57 compute-0 nova_compute[117514]: 2025-10-08 19:12:57.308 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "refresh_cache-093d721c-61cb-4fd3-b678-7465d8840cc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:12:57 compute-0 nova_compute[117514]: 2025-10-08 19:12:57.308 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquired lock "refresh_cache-093d721c-61cb-4fd3-b678-7465d8840cc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:12:57 compute-0 nova_compute[117514]: 2025-10-08 19:12:57.308 2 DEBUG nova.network.neutron [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 19:12:57 compute-0 nova_compute[117514]: 2025-10-08 19:12:57.390 2 DEBUG nova.compute.manager [req-ffe03cf9-9031-4704-b71b-98e12400eee4 req-1ddb67d5-c9d4-477e-a26a-1917d51ec9af bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Received event network-changed-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:12:57 compute-0 nova_compute[117514]: 2025-10-08 19:12:57.390 2 DEBUG nova.compute.manager [req-ffe03cf9-9031-4704-b71b-98e12400eee4 req-1ddb67d5-c9d4-477e-a26a-1917d51ec9af bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Refreshing instance network info cache due to event network-changed-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 19:12:57 compute-0 nova_compute[117514]: 2025-10-08 19:12:57.390 2 DEBUG oslo_concurrency.lockutils [req-ffe03cf9-9031-4704-b71b-98e12400eee4 req-1ddb67d5-c9d4-477e-a26a-1917d51ec9af bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-093d721c-61cb-4fd3-b678-7465d8840cc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:12:57 compute-0 nova_compute[117514]: 2025-10-08 19:12:57.451 2 DEBUG nova.network.neutron [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.171 2 DEBUG nova.network.neutron [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Updating instance_info_cache with network_info: [{"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.193 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Releasing lock "refresh_cache-093d721c-61cb-4fd3-b678-7465d8840cc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.194 2 DEBUG nova.compute.manager [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Instance network_info: |[{"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.195 2 DEBUG oslo_concurrency.lockutils [req-ffe03cf9-9031-4704-b71b-98e12400eee4 req-1ddb67d5-c9d4-477e-a26a-1917d51ec9af bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-093d721c-61cb-4fd3-b678-7465d8840cc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.196 2 DEBUG nova.network.neutron [req-ffe03cf9-9031-4704-b71b-98e12400eee4 req-1ddb67d5-c9d4-477e-a26a-1917d51ec9af bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Refreshing network info cache for port 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.203 2 DEBUG nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Start _get_guest_xml network_info=[{"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'image_id': '23cfa426-7011-4566-992d-1c7af39f70dd'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.210 2 WARNING nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.219 2 DEBUG nova.virt.libvirt.host [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.220 2 DEBUG nova.virt.libvirt.host [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.226 2 DEBUG nova.virt.libvirt.host [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.227 2 DEBUG nova.virt.libvirt.host [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.227 2 DEBUG nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.228 2 DEBUG nova.virt.hardware [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T19:05:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e8a148fc-4419-4813-98ff-a17e2a95609e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.228 2 DEBUG nova.virt.hardware [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.229 2 DEBUG nova.virt.hardware [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.229 2 DEBUG nova.virt.hardware [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.229 2 DEBUG nova.virt.hardware [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.229 2 DEBUG nova.virt.hardware [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.230 2 DEBUG nova.virt.hardware [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.230 2 DEBUG nova.virt.hardware [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.230 2 DEBUG nova.virt.hardware [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.231 2 DEBUG nova.virt.hardware [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.231 2 DEBUG nova.virt.hardware [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.235 2 DEBUG nova.virt.libvirt.vif [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:12:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1974827177',display_name='tempest-TestNetworkBasicOps-server-1974827177',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1974827177',id=9,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLyoze6rxyNRjrKnZ/n+vsTth9kzwYzz/7DU1WtoejT8IDCjJBZl23bG4N5vxWcqQprun8odMD7xEnPv//MudkIlq44roa1e3u7lgMT8KOfJfpcO6Gbpp6ERjS4fOIF90w==',key_name='tempest-TestNetworkBasicOps-106019254',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-kf002v69',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:12:54Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=093d721c-61cb-4fd3-b678-7465d8840cc6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.236 2 DEBUG nova.network.os_vif_util [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.236 2 DEBUG nova.network.os_vif_util [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:78:c5:0e,bridge_name='br-int',has_traffic_filtering=True,id=8b1cf032-8f00-4a3b-a370-211b5b0ca4ce,network=Network(9bec1de5-8be3-4df6-b90a-943d76fedc48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8b1cf032-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.237 2 DEBUG nova.objects.instance [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 093d721c-61cb-4fd3-b678-7465d8840cc6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.254 2 DEBUG nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] End _get_guest_xml xml=<domain type="kvm">
Oct  8 19:12:59 compute-0 nova_compute[117514]:  <uuid>093d721c-61cb-4fd3-b678-7465d8840cc6</uuid>
Oct  8 19:12:59 compute-0 nova_compute[117514]:  <name>instance-00000009</name>
Oct  8 19:12:59 compute-0 nova_compute[117514]:  <memory>131072</memory>
Oct  8 19:12:59 compute-0 nova_compute[117514]:  <vcpu>1</vcpu>
Oct  8 19:12:59 compute-0 nova_compute[117514]:  <metadata>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 19:12:59 compute-0 nova_compute[117514]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:      <nova:name>tempest-TestNetworkBasicOps-server-1974827177</nova:name>
Oct  8 19:12:59 compute-0 nova_compute[117514]:      <nova:creationTime>2025-10-08 19:12:59</nova:creationTime>
Oct  8 19:12:59 compute-0 nova_compute[117514]:      <nova:flavor name="m1.nano">
Oct  8 19:12:59 compute-0 nova_compute[117514]:        <nova:memory>128</nova:memory>
Oct  8 19:12:59 compute-0 nova_compute[117514]:        <nova:disk>1</nova:disk>
Oct  8 19:12:59 compute-0 nova_compute[117514]:        <nova:swap>0</nova:swap>
Oct  8 19:12:59 compute-0 nova_compute[117514]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 19:12:59 compute-0 nova_compute[117514]:        <nova:vcpus>1</nova:vcpus>
Oct  8 19:12:59 compute-0 nova_compute[117514]:      </nova:flavor>
Oct  8 19:12:59 compute-0 nova_compute[117514]:      <nova:owner>
Oct  8 19:12:59 compute-0 nova_compute[117514]:        <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct  8 19:12:59 compute-0 nova_compute[117514]:        <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct  8 19:12:59 compute-0 nova_compute[117514]:      </nova:owner>
Oct  8 19:12:59 compute-0 nova_compute[117514]:      <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:      <nova:ports>
Oct  8 19:12:59 compute-0 nova_compute[117514]:        <nova:port uuid="8b1cf032-8f00-4a3b-a370-211b5b0ca4ce">
Oct  8 19:12:59 compute-0 nova_compute[117514]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:        </nova:port>
Oct  8 19:12:59 compute-0 nova_compute[117514]:      </nova:ports>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    </nova:instance>
Oct  8 19:12:59 compute-0 nova_compute[117514]:  </metadata>
Oct  8 19:12:59 compute-0 nova_compute[117514]:  <sysinfo type="smbios">
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <system>
Oct  8 19:12:59 compute-0 nova_compute[117514]:      <entry name="manufacturer">RDO</entry>
Oct  8 19:12:59 compute-0 nova_compute[117514]:      <entry name="product">OpenStack Compute</entry>
Oct  8 19:12:59 compute-0 nova_compute[117514]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 19:12:59 compute-0 nova_compute[117514]:      <entry name="serial">093d721c-61cb-4fd3-b678-7465d8840cc6</entry>
Oct  8 19:12:59 compute-0 nova_compute[117514]:      <entry name="uuid">093d721c-61cb-4fd3-b678-7465d8840cc6</entry>
Oct  8 19:12:59 compute-0 nova_compute[117514]:      <entry name="family">Virtual Machine</entry>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    </system>
Oct  8 19:12:59 compute-0 nova_compute[117514]:  </sysinfo>
Oct  8 19:12:59 compute-0 nova_compute[117514]:  <os>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <boot dev="hd"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <smbios mode="sysinfo"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:  </os>
Oct  8 19:12:59 compute-0 nova_compute[117514]:  <features>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <acpi/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <apic/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <vmcoreinfo/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:  </features>
Oct  8 19:12:59 compute-0 nova_compute[117514]:  <clock offset="utc">
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <timer name="hpet" present="no"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:  </clock>
Oct  8 19:12:59 compute-0 nova_compute[117514]:  <cpu mode="host-model" match="exact">
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:  </cpu>
Oct  8 19:12:59 compute-0 nova_compute[117514]:  <devices>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <disk type="file" device="disk">
Oct  8 19:12:59 compute-0 nova_compute[117514]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:      <source file="/var/lib/nova/instances/093d721c-61cb-4fd3-b678-7465d8840cc6/disk"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:      <target dev="vda" bus="virtio"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <disk type="file" device="cdrom">
Oct  8 19:12:59 compute-0 nova_compute[117514]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:      <source file="/var/lib/nova/instances/093d721c-61cb-4fd3-b678-7465d8840cc6/disk.config"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:      <target dev="sda" bus="sata"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <interface type="ethernet">
Oct  8 19:12:59 compute-0 nova_compute[117514]:      <mac address="fa:16:3e:78:c5:0e"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:      <model type="virtio"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:      <mtu size="1442"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:      <target dev="tap8b1cf032-8f"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    </interface>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <serial type="pty">
Oct  8 19:12:59 compute-0 nova_compute[117514]:      <log file="/var/lib/nova/instances/093d721c-61cb-4fd3-b678-7465d8840cc6/console.log" append="off"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    </serial>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <video>
Oct  8 19:12:59 compute-0 nova_compute[117514]:      <model type="virtio"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    </video>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <input type="tablet" bus="usb"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <rng model="virtio">
Oct  8 19:12:59 compute-0 nova_compute[117514]:      <backend model="random">/dev/urandom</backend>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    </rng>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <controller type="usb" index="0"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    <memballoon model="virtio">
Oct  8 19:12:59 compute-0 nova_compute[117514]:      <stats period="10"/>
Oct  8 19:12:59 compute-0 nova_compute[117514]:    </memballoon>
Oct  8 19:12:59 compute-0 nova_compute[117514]:  </devices>
Oct  8 19:12:59 compute-0 nova_compute[117514]: </domain>
Oct  8 19:12:59 compute-0 nova_compute[117514]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.256 2 DEBUG nova.compute.manager [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Preparing to wait for external event network-vif-plugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.256 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "093d721c-61cb-4fd3-b678-7465d8840cc6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.256 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "093d721c-61cb-4fd3-b678-7465d8840cc6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.256 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "093d721c-61cb-4fd3-b678-7465d8840cc6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.257 2 DEBUG nova.virt.libvirt.vif [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:12:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1974827177',display_name='tempest-TestNetworkBasicOps-server-1974827177',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1974827177',id=9,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLyoze6rxyNRjrKnZ/n+vsTth9kzwYzz/7DU1WtoejT8IDCjJBZl23bG4N5vxWcqQprun8odMD7xEnPv//MudkIlq44roa1e3u7lgMT8KOfJfpcO6Gbpp6ERjS4fOIF90w==',key_name='tempest-TestNetworkBasicOps-106019254',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-kf002v69',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:12:54Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=093d721c-61cb-4fd3-b678-7465d8840cc6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": 
{}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.258 2 DEBUG nova.network.os_vif_util [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.258 2 DEBUG nova.network.os_vif_util [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:78:c5:0e,bridge_name='br-int',has_traffic_filtering=True,id=8b1cf032-8f00-4a3b-a370-211b5b0ca4ce,network=Network(9bec1de5-8be3-4df6-b90a-943d76fedc48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8b1cf032-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.259 2 DEBUG os_vif [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:c5:0e,bridge_name='br-int',has_traffic_filtering=True,id=8b1cf032-8f00-4a3b-a370-211b5b0ca4ce,network=Network(9bec1de5-8be3-4df6-b90a-943d76fedc48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8b1cf032-8f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.260 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.260 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.264 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8b1cf032-8f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.264 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8b1cf032-8f, col_values=(('external_ids', {'iface-id': '8b1cf032-8f00-4a3b-a370-211b5b0ca4ce', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:78:c5:0e', 'vm-uuid': '093d721c-61cb-4fd3-b678-7465d8840cc6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:59 compute-0 NetworkManager[1035]: <info>  [1759950779.2676] manager: (tap8b1cf032-8f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.276 2 INFO os_vif [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:c5:0e,bridge_name='br-int',has_traffic_filtering=True,id=8b1cf032-8f00-4a3b-a370-211b5b0ca4ce,network=Network(9bec1de5-8be3-4df6-b90a-943d76fedc48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8b1cf032-8f')#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.457 2 DEBUG nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.458 2 DEBUG nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.458 2 DEBUG nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No VIF found with MAC fa:16:3e:78:c5:0e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 19:12:59 compute-0 nova_compute[117514]: 2025-10-08 19:12:59.459 2 INFO nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Using config drive#033[00m
Oct  8 19:13:00 compute-0 nova_compute[117514]: 2025-10-08 19:13:00.221 2 INFO nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Creating config drive at /var/lib/nova/instances/093d721c-61cb-4fd3-b678-7465d8840cc6/disk.config#033[00m
Oct  8 19:13:00 compute-0 nova_compute[117514]: 2025-10-08 19:13:00.227 2 DEBUG oslo_concurrency.processutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/093d721c-61cb-4fd3-b678-7465d8840cc6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4ns4exzo execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:13:00 compute-0 nova_compute[117514]: 2025-10-08 19:13:00.349 2 DEBUG oslo_concurrency.processutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/093d721c-61cb-4fd3-b678-7465d8840cc6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4ns4exzo" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:13:00 compute-0 kernel: tap8b1cf032-8f: entered promiscuous mode
Oct  8 19:13:00 compute-0 NetworkManager[1035]: <info>  [1759950780.4275] manager: (tap8b1cf032-8f): new Tun device (/org/freedesktop/NetworkManager/Devices/71)
Oct  8 19:13:00 compute-0 ovn_controller[19759]: 2025-10-08T19:13:00Z|00119|binding|INFO|Claiming lport 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce for this chassis.
Oct  8 19:13:00 compute-0 ovn_controller[19759]: 2025-10-08T19:13:00Z|00120|binding|INFO|8b1cf032-8f00-4a3b-a370-211b5b0ca4ce: Claiming fa:16:3e:78:c5:0e 10.100.0.4
Oct  8 19:13:00 compute-0 nova_compute[117514]: 2025-10-08 19:13:00.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:00 compute-0 systemd-udevd[149089]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.454 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:c5:0e 10.100.0.4'], port_security=['fa:16:3e:78:c5:0e 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-94131643', 'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '093d721c-61cb-4fd3-b678-7465d8840cc6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9bec1de5-8be3-4df6-b90a-943d76fedc48', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-94131643', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'be57f10c-6afc-483d-a1fa-fab953b8fe3e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.175'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1d3717a1-7175-40f6-8720-19b5f1d50c4f, chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=8b1cf032-8f00-4a3b-a370-211b5b0ca4ce) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:13:00 compute-0 ovn_controller[19759]: 2025-10-08T19:13:00Z|00121|binding|INFO|Setting lport 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce ovn-installed in OVS
Oct  8 19:13:00 compute-0 ovn_controller[19759]: 2025-10-08T19:13:00Z|00122|binding|INFO|Setting lport 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce up in Southbound
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.455 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce in datapath 9bec1de5-8be3-4df6-b90a-943d76fedc48 bound to our chassis#033[00m
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.456 28643 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9bec1de5-8be3-4df6-b90a-943d76fedc48#033[00m
Oct  8 19:13:00 compute-0 nova_compute[117514]: 2025-10-08 19:13:00.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:00 compute-0 nova_compute[117514]: 2025-10-08 19:13:00.459 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:00 compute-0 NetworkManager[1035]: <info>  [1759950780.4720] device (tap8b1cf032-8f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 19:13:00 compute-0 NetworkManager[1035]: <info>  [1759950780.4743] device (tap8b1cf032-8f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.470 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[51aade05-2ce0-457b-880d-b2cbb8b85f0e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.472 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9bec1de5-81 in ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.474 144726 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9bec1de5-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.474 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[e3616fae-ec15-4429-b09f-cfd3aba805c1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.476 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[4672f836-921b-4dea-805c-b7766a46359c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:00 compute-0 systemd-machined[77568]: New machine qemu-9-instance-00000009.
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.492 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[38af38ec-9da5-4fe7-a462-ce506eef82f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.507 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[0c2c9c94-3ef8-4fe3-a333-38edca5f006a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:00 compute-0 systemd[1]: Started Virtual Machine qemu-9-instance-00000009.
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.536 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[dbc2fb2a-9365-4a02-bc77-78a5bd529c66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.542 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[ad658083-ae39-4345-bcf5-46387fe93d82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:00 compute-0 NetworkManager[1035]: <info>  [1759950780.5445] manager: (tap9bec1de5-80): new Veth device (/org/freedesktop/NetworkManager/Devices/72)
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.582 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[f6ba6c15-9470-47d5-881a-fe5f010c9b50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.586 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[b1560fca-b16f-44dc-b2e4-0d34bff21b12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:00 compute-0 NetworkManager[1035]: <info>  [1759950780.6090] device (tap9bec1de5-80): carrier: link connected
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.615 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[c5478c87-3d66-468c-ae6e-b03955a990ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.633 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[713db871-2f5c-4d9c-93fe-344bc10cacbb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9bec1de5-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e5:ad:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 145118, 'reachable_time': 22925, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 149125, 'error': None, 'target': 'ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:00 compute-0 nova_compute[117514]: 2025-10-08 19:13:00.641 2 DEBUG nova.compute.manager [req-493149bb-30ab-4d0c-871c-99c6e6a26035 req-2f1d66b2-4fbb-4b8a-b8ff-cb4075d0e9d4 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Received event network-vif-plugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:13:00 compute-0 nova_compute[117514]: 2025-10-08 19:13:00.642 2 DEBUG oslo_concurrency.lockutils [req-493149bb-30ab-4d0c-871c-99c6e6a26035 req-2f1d66b2-4fbb-4b8a-b8ff-cb4075d0e9d4 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "093d721c-61cb-4fd3-b678-7465d8840cc6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:13:00 compute-0 nova_compute[117514]: 2025-10-08 19:13:00.643 2 DEBUG oslo_concurrency.lockutils [req-493149bb-30ab-4d0c-871c-99c6e6a26035 req-2f1d66b2-4fbb-4b8a-b8ff-cb4075d0e9d4 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "093d721c-61cb-4fd3-b678-7465d8840cc6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:13:00 compute-0 nova_compute[117514]: 2025-10-08 19:13:00.643 2 DEBUG oslo_concurrency.lockutils [req-493149bb-30ab-4d0c-871c-99c6e6a26035 req-2f1d66b2-4fbb-4b8a-b8ff-cb4075d0e9d4 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "093d721c-61cb-4fd3-b678-7465d8840cc6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:13:00 compute-0 nova_compute[117514]: 2025-10-08 19:13:00.644 2 DEBUG nova.compute.manager [req-493149bb-30ab-4d0c-871c-99c6e6a26035 req-2f1d66b2-4fbb-4b8a-b8ff-cb4075d0e9d4 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Processing event network-vif-plugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.656 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[08ef3c1e-b7f8-451e-8d5e-2701a30bb5ab]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee5:ad60'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 145118, 'tstamp': 145118}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 149126, 'error': None, 'target': 'ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.677 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[f86da07c-6473-4bb5-b758-73915834b1bf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9bec1de5-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e5:ad:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 145118, 'reachable_time': 22925, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 149127, 'error': None, 'target': 'ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.723 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[cf575ec2-4ecd-42b4-9f36-b68e5b7fa8a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.813 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[fceb1425-7f37-464b-bab4-a19c3290eca1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.816 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9bec1de5-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.816 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.817 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9bec1de5-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:13:00 compute-0 nova_compute[117514]: 2025-10-08 19:13:00.820 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:00 compute-0 NetworkManager[1035]: <info>  [1759950780.8207] manager: (tap9bec1de5-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Oct  8 19:13:00 compute-0 kernel: tap9bec1de5-80: entered promiscuous mode
Oct  8 19:13:00 compute-0 nova_compute[117514]: 2025-10-08 19:13:00.823 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.825 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9bec1de5-80, col_values=(('external_ids', {'iface-id': 'b57e5c57-68fb-43de-8f87-98a853dc8be7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:13:00 compute-0 nova_compute[117514]: 2025-10-08 19:13:00.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:00 compute-0 ovn_controller[19759]: 2025-10-08T19:13:00Z|00123|binding|INFO|Releasing lport b57e5c57-68fb-43de-8f87-98a853dc8be7 from this chassis (sb_readonly=0)
Oct  8 19:13:00 compute-0 nova_compute[117514]: 2025-10-08 19:13:00.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.851 28643 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9bec1de5-8be3-4df6-b90a-943d76fedc48.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9bec1de5-8be3-4df6-b90a-943d76fedc48.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.852 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[910c6230-b3ce-4481-ac85-58b9189dc924]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.853 28643 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]: global
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]:    log         /dev/log local0 debug
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]:    log-tag     haproxy-metadata-proxy-9bec1de5-8be3-4df6-b90a-943d76fedc48
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]:    user        root
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]:    group       root
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]:    maxconn     1024
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]:    pidfile     /var/lib/neutron/external/pids/9bec1de5-8be3-4df6-b90a-943d76fedc48.pid.haproxy
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]:    daemon
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]: 
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]: defaults
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]:    log global
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]:    mode http
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]:    option httplog
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]:    option dontlognull
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]:    option http-server-close
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]:    option forwardfor
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]:    retries                 3
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]:    timeout http-request    30s
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]:    timeout connect         30s
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]:    timeout client          32s
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]:    timeout server          32s
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]:    timeout http-keep-alive 30s
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]: 
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]: 
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]: listen listener
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]:    bind 169.254.169.254:80
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]:    http-request add-header X-OVN-Network-ID 9bec1de5-8be3-4df6-b90a-943d76fedc48
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 19:13:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:00.854 28643 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48', 'env', 'PROCESS_TAG=haproxy-9bec1de5-8be3-4df6-b90a-943d76fedc48', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9bec1de5-8be3-4df6-b90a-943d76fedc48.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:01.131 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:75:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '5e:14:dd:63:55:2a'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.161 2 DEBUG nova.network.neutron [req-ffe03cf9-9031-4704-b71b-98e12400eee4 req-1ddb67d5-c9d4-477e-a26a-1917d51ec9af bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Updated VIF entry in instance network info cache for port 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.161 2 DEBUG nova.network.neutron [req-ffe03cf9-9031-4704-b71b-98e12400eee4 req-1ddb67d5-c9d4-477e-a26a-1917d51ec9af bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Updating instance_info_cache with network_info: [{"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.176 2 DEBUG oslo_concurrency.lockutils [req-ffe03cf9-9031-4704-b71b-98e12400eee4 req-1ddb67d5-c9d4-477e-a26a-1917d51ec9af bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-093d721c-61cb-4fd3-b678-7465d8840cc6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:01 compute-0 podman[149164]: 2025-10-08 19:13:01.249115905 +0000 UTC m=+0.057414577 container create bb9f09cd0eed6235cc64b81363d45392abfcd27382ba618570c50d74721f0177 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  8 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.259 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950781.2589304, 093d721c-61cb-4fd3-b678-7465d8840cc6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.260 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] VM Started (Lifecycle Event)#033[00m
Oct  8 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.262 2 DEBUG nova.compute.manager [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.266 2 DEBUG nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.268 2 INFO nova.virt.libvirt.driver [-] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Instance spawned successfully.#033[00m
Oct  8 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.269 2 DEBUG nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.279 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.281 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.289 2 DEBUG nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.289 2 DEBUG nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.290 2 DEBUG nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.290 2 DEBUG nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.290 2 DEBUG nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.290 2 DEBUG nova.virt.libvirt.driver [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:13:01 compute-0 systemd[1]: Started libpod-conmon-bb9f09cd0eed6235cc64b81363d45392abfcd27382ba618570c50d74721f0177.scope.
Oct  8 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.297 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.297 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950781.261934, 093d721c-61cb-4fd3-b678-7465d8840cc6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.297 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] VM Paused (Lifecycle Event)#033[00m
Oct  8 19:13:01 compute-0 podman[149164]: 2025-10-08 19:13:01.219354397 +0000 UTC m=+0.027653079 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct  8 19:13:01 compute-0 systemd[1]: Started libcrun container.
Oct  8 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.320 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:13:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dd409b632a0b96c86655809d2dc0d8db1d12c174e3e7557cccb2a1f4c341ed48/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.324 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950781.2645817, 093d721c-61cb-4fd3-b678-7465d8840cc6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.324 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] VM Resumed (Lifecycle Event)#033[00m
Oct  8 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.339 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:13:01 compute-0 podman[149164]: 2025-10-08 19:13:01.341700237 +0000 UTC m=+0.149998979 container init bb9f09cd0eed6235cc64b81363d45392abfcd27382ba618570c50d74721f0177 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.341 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.346 2 INFO nova.compute.manager [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Took 7.06 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.347 2 DEBUG nova.compute.manager [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:13:01 compute-0 podman[149164]: 2025-10-08 19:13:01.352259051 +0000 UTC m=+0.160557753 container start bb9f09cd0eed6235cc64b81363d45392abfcd27382ba618570c50d74721f0177 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct  8 19:13:01 compute-0 neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48[149180]: [NOTICE]   (149184) : New worker (149186) forked
Oct  8 19:13:01 compute-0 neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48[149180]: [NOTICE]   (149184) : Loading success.
Oct  8 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.373 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.411 2 INFO nova.compute.manager [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Took 7.50 seconds to build instance.#033[00m
Oct  8 19:13:01 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:01.428 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 19:13:01 compute-0 nova_compute[117514]: 2025-10-08 19:13:01.430 2 DEBUG oslo_concurrency.lockutils [None req-7d9eb3bb-0cd1-4031-8fa3-a5241ba26367 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "093d721c-61cb-4fd3-b678-7465d8840cc6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.581s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:13:02 compute-0 podman[149195]: 2025-10-08 19:13:02.64336497 +0000 UTC m=+0.060889628 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.expose-services=, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-type=git, architecture=x86_64, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Oct  8 19:13:02 compute-0 podman[149196]: 2025-10-08 19:13:02.678609236 +0000 UTC m=+0.095322411 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  8 19:13:02 compute-0 nova_compute[117514]: 2025-10-08 19:13:02.709 2 DEBUG nova.compute.manager [req-16252b51-7bcc-4f60-a65a-662c7cf1c589 req-b27febea-f744-41ef-bbd5-a9a022440b89 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Received event network-vif-plugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:13:02 compute-0 nova_compute[117514]: 2025-10-08 19:13:02.710 2 DEBUG oslo_concurrency.lockutils [req-16252b51-7bcc-4f60-a65a-662c7cf1c589 req-b27febea-f744-41ef-bbd5-a9a022440b89 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "093d721c-61cb-4fd3-b678-7465d8840cc6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:13:02 compute-0 nova_compute[117514]: 2025-10-08 19:13:02.710 2 DEBUG oslo_concurrency.lockutils [req-16252b51-7bcc-4f60-a65a-662c7cf1c589 req-b27febea-f744-41ef-bbd5-a9a022440b89 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "093d721c-61cb-4fd3-b678-7465d8840cc6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:13:02 compute-0 nova_compute[117514]: 2025-10-08 19:13:02.710 2 DEBUG oslo_concurrency.lockutils [req-16252b51-7bcc-4f60-a65a-662c7cf1c589 req-b27febea-f744-41ef-bbd5-a9a022440b89 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "093d721c-61cb-4fd3-b678-7465d8840cc6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:13:02 compute-0 nova_compute[117514]: 2025-10-08 19:13:02.711 2 DEBUG nova.compute.manager [req-16252b51-7bcc-4f60-a65a-662c7cf1c589 req-b27febea-f744-41ef-bbd5-a9a022440b89 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] No waiting events found dispatching network-vif-plugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:13:02 compute-0 nova_compute[117514]: 2025-10-08 19:13:02.711 2 WARNING nova.compute.manager [req-16252b51-7bcc-4f60-a65a-662c7cf1c589 req-b27febea-f744-41ef-bbd5-a9a022440b89 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Received unexpected event network-vif-plugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce for instance with vm_state active and task_state None.#033[00m
Oct  8 19:13:03 compute-0 nova_compute[117514]: 2025-10-08 19:13:03.888 2 DEBUG oslo_concurrency.lockutils [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "093d721c-61cb-4fd3-b678-7465d8840cc6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:13:03 compute-0 nova_compute[117514]: 2025-10-08 19:13:03.889 2 DEBUG oslo_concurrency.lockutils [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "093d721c-61cb-4fd3-b678-7465d8840cc6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:13:03 compute-0 nova_compute[117514]: 2025-10-08 19:13:03.889 2 DEBUG oslo_concurrency.lockutils [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "093d721c-61cb-4fd3-b678-7465d8840cc6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:13:03 compute-0 nova_compute[117514]: 2025-10-08 19:13:03.889 2 DEBUG oslo_concurrency.lockutils [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "093d721c-61cb-4fd3-b678-7465d8840cc6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:13:03 compute-0 nova_compute[117514]: 2025-10-08 19:13:03.889 2 DEBUG oslo_concurrency.lockutils [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "093d721c-61cb-4fd3-b678-7465d8840cc6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:13:03 compute-0 nova_compute[117514]: 2025-10-08 19:13:03.891 2 INFO nova.compute.manager [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Terminating instance#033[00m
Oct  8 19:13:03 compute-0 nova_compute[117514]: 2025-10-08 19:13:03.892 2 DEBUG nova.compute.manager [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 19:13:03 compute-0 kernel: tap8b1cf032-8f (unregistering): left promiscuous mode
Oct  8 19:13:03 compute-0 NetworkManager[1035]: <info>  [1759950783.9219] device (tap8b1cf032-8f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 19:13:03 compute-0 nova_compute[117514]: 2025-10-08 19:13:03.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:03 compute-0 ovn_controller[19759]: 2025-10-08T19:13:03Z|00124|binding|INFO|Releasing lport 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce from this chassis (sb_readonly=0)
Oct  8 19:13:03 compute-0 ovn_controller[19759]: 2025-10-08T19:13:03Z|00125|binding|INFO|Setting lport 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce down in Southbound
Oct  8 19:13:03 compute-0 ovn_controller[19759]: 2025-10-08T19:13:03Z|00126|binding|INFO|Removing iface tap8b1cf032-8f ovn-installed in OVS
Oct  8 19:13:03 compute-0 nova_compute[117514]: 2025-10-08 19:13:03.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:03.943 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:c5:0e 10.100.0.4'], port_security=['fa:16:3e:78:c5:0e 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-94131643', 'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '093d721c-61cb-4fd3-b678-7465d8840cc6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9bec1de5-8be3-4df6-b90a-943d76fedc48', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-94131643', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'be57f10c-6afc-483d-a1fa-fab953b8fe3e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.175', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1d3717a1-7175-40f6-8720-19b5f1d50c4f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=8b1cf032-8f00-4a3b-a370-211b5b0ca4ce) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:13:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:03.945 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 8b1cf032-8f00-4a3b-a370-211b5b0ca4ce in datapath 9bec1de5-8be3-4df6-b90a-943d76fedc48 unbound from our chassis#033[00m
Oct  8 19:13:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:03.947 28643 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9bec1de5-8be3-4df6-b90a-943d76fedc48, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 19:13:03 compute-0 nova_compute[117514]: 2025-10-08 19:13:03.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:03.949 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[397e1090-0e30-4af6-916d-6a422311cc3e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:03.950 28643 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48 namespace which is not needed anymore#033[00m
Oct  8 19:13:03 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000009.scope: Deactivated successfully.
Oct  8 19:13:03 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000009.scope: Consumed 3.382s CPU time.
Oct  8 19:13:03 compute-0 systemd-machined[77568]: Machine qemu-9-instance-00000009 terminated.
Oct  8 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.168 2 INFO nova.virt.libvirt.driver [-] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Instance destroyed successfully.#033[00m
Oct  8 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.169 2 DEBUG nova.objects.instance [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'resources' on Instance uuid 093d721c-61cb-4fd3-b678-7465d8840cc6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.184 2 DEBUG nova.virt.libvirt.vif [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:12:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1974827177',display_name='tempest-TestNetworkBasicOps-server-1974827177',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1974827177',id=9,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLyoze6rxyNRjrKnZ/n+vsTth9kzwYzz/7DU1WtoejT8IDCjJBZl23bG4N5vxWcqQprun8odMD7xEnPv//MudkIlq44roa1e3u7lgMT8KOfJfpcO6Gbpp6ERjS4fOIF90w==',key_name='tempest-TestNetworkBasicOps-106019254',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:13:01Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-kf002v69',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:13:01Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=093d721c-61cb-4fd3-b678-7465d8840cc6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.184 2 DEBUG nova.network.os_vif_util [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "address": "fa:16:3e:78:c5:0e", "network": {"id": "9bec1de5-8be3-4df6-b90a-943d76fedc48", "bridge": "br-int", "label": "tempest-network-smoke--1142604311", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8b1cf032-8f", "ovs_interfaceid": "8b1cf032-8f00-4a3b-a370-211b5b0ca4ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.185 2 DEBUG nova.network.os_vif_util [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:78:c5:0e,bridge_name='br-int',has_traffic_filtering=True,id=8b1cf032-8f00-4a3b-a370-211b5b0ca4ce,network=Network(9bec1de5-8be3-4df6-b90a-943d76fedc48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8b1cf032-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.186 2 DEBUG os_vif [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:c5:0e,bridge_name='br-int',has_traffic_filtering=True,id=8b1cf032-8f00-4a3b-a370-211b5b0ca4ce,network=Network(9bec1de5-8be3-4df6-b90a-943d76fedc48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8b1cf032-8f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.189 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8b1cf032-8f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.217 2 INFO os_vif [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:c5:0e,bridge_name='br-int',has_traffic_filtering=True,id=8b1cf032-8f00-4a3b-a370-211b5b0ca4ce,network=Network(9bec1de5-8be3-4df6-b90a-943d76fedc48),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap8b1cf032-8f')#033[00m
Oct  8 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.218 2 INFO nova.virt.libvirt.driver [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Deleting instance files /var/lib/nova/instances/093d721c-61cb-4fd3-b678-7465d8840cc6_del#033[00m
Oct  8 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.219 2 INFO nova.virt.libvirt.driver [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Deletion of /var/lib/nova/instances/093d721c-61cb-4fd3-b678-7465d8840cc6_del complete#033[00m
Oct  8 19:13:04 compute-0 neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48[149180]: [NOTICE]   (149184) : haproxy version is 2.8.14-c23fe91
Oct  8 19:13:04 compute-0 neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48[149180]: [NOTICE]   (149184) : path to executable is /usr/sbin/haproxy
Oct  8 19:13:04 compute-0 neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48[149180]: [WARNING]  (149184) : Exiting Master process...
Oct  8 19:13:04 compute-0 neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48[149180]: [WARNING]  (149184) : Exiting Master process...
Oct  8 19:13:04 compute-0 neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48[149180]: [ALERT]    (149184) : Current worker (149186) exited with code 143 (Terminated)
Oct  8 19:13:04 compute-0 neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48[149180]: [WARNING]  (149184) : All workers exited. Exiting... (0)
Oct  8 19:13:04 compute-0 systemd[1]: libpod-bb9f09cd0eed6235cc64b81363d45392abfcd27382ba618570c50d74721f0177.scope: Deactivated successfully.
Oct  8 19:13:04 compute-0 podman[149254]: 2025-10-08 19:13:04.245562545 +0000 UTC m=+0.161859781 container died bb9f09cd0eed6235cc64b81363d45392abfcd27382ba618570c50d74721f0177 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct  8 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.274 2 INFO nova.compute.manager [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.275 2 DEBUG oslo.service.loopingcall [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.275 2 DEBUG nova.compute.manager [-] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.275 2 DEBUG nova.network.neutron [-] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 19:13:04 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bb9f09cd0eed6235cc64b81363d45392abfcd27382ba618570c50d74721f0177-userdata-shm.mount: Deactivated successfully.
Oct  8 19:13:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-dd409b632a0b96c86655809d2dc0d8db1d12c174e3e7557cccb2a1f4c341ed48-merged.mount: Deactivated successfully.
Oct  8 19:13:04 compute-0 podman[149300]: 2025-10-08 19:13:04.528039344 +0000 UTC m=+0.074784578 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 19:13:04 compute-0 podman[149254]: 2025-10-08 19:13:04.559529173 +0000 UTC m=+0.475826409 container cleanup bb9f09cd0eed6235cc64b81363d45392abfcd27382ba618570c50d74721f0177 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  8 19:13:04 compute-0 systemd[1]: libpod-conmon-bb9f09cd0eed6235cc64b81363d45392abfcd27382ba618570c50d74721f0177.scope: Deactivated successfully.
Oct  8 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.952 2 DEBUG nova.compute.manager [req-d1a279e4-56d3-4bc1-abb2-051b2f2f5169 req-89caeffe-6a81-4739-bdc1-4df12137f4a8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Received event network-vif-unplugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.953 2 DEBUG oslo_concurrency.lockutils [req-d1a279e4-56d3-4bc1-abb2-051b2f2f5169 req-89caeffe-6a81-4739-bdc1-4df12137f4a8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "093d721c-61cb-4fd3-b678-7465d8840cc6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.954 2 DEBUG oslo_concurrency.lockutils [req-d1a279e4-56d3-4bc1-abb2-051b2f2f5169 req-89caeffe-6a81-4739-bdc1-4df12137f4a8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "093d721c-61cb-4fd3-b678-7465d8840cc6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.954 2 DEBUG oslo_concurrency.lockutils [req-d1a279e4-56d3-4bc1-abb2-051b2f2f5169 req-89caeffe-6a81-4739-bdc1-4df12137f4a8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "093d721c-61cb-4fd3-b678-7465d8840cc6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.955 2 DEBUG nova.compute.manager [req-d1a279e4-56d3-4bc1-abb2-051b2f2f5169 req-89caeffe-6a81-4739-bdc1-4df12137f4a8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] No waiting events found dispatching network-vif-unplugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.955 2 DEBUG nova.compute.manager [req-d1a279e4-56d3-4bc1-abb2-051b2f2f5169 req-89caeffe-6a81-4739-bdc1-4df12137f4a8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Received event network-vif-unplugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.956 2 DEBUG nova.compute.manager [req-d1a279e4-56d3-4bc1-abb2-051b2f2f5169 req-89caeffe-6a81-4739-bdc1-4df12137f4a8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Received event network-vif-plugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.956 2 DEBUG oslo_concurrency.lockutils [req-d1a279e4-56d3-4bc1-abb2-051b2f2f5169 req-89caeffe-6a81-4739-bdc1-4df12137f4a8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "093d721c-61cb-4fd3-b678-7465d8840cc6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.956 2 DEBUG oslo_concurrency.lockutils [req-d1a279e4-56d3-4bc1-abb2-051b2f2f5169 req-89caeffe-6a81-4739-bdc1-4df12137f4a8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "093d721c-61cb-4fd3-b678-7465d8840cc6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.957 2 DEBUG oslo_concurrency.lockutils [req-d1a279e4-56d3-4bc1-abb2-051b2f2f5169 req-89caeffe-6a81-4739-bdc1-4df12137f4a8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "093d721c-61cb-4fd3-b678-7465d8840cc6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.957 2 DEBUG nova.compute.manager [req-d1a279e4-56d3-4bc1-abb2-051b2f2f5169 req-89caeffe-6a81-4739-bdc1-4df12137f4a8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] No waiting events found dispatching network-vif-plugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:13:04 compute-0 nova_compute[117514]: 2025-10-08 19:13:04.958 2 WARNING nova.compute.manager [req-d1a279e4-56d3-4bc1-abb2-051b2f2f5169 req-89caeffe-6a81-4739-bdc1-4df12137f4a8 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Received unexpected event network-vif-plugged-8b1cf032-8f00-4a3b-a370-211b5b0ca4ce for instance with vm_state active and task_state deleting.#033[00m
Oct  8 19:13:05 compute-0 podman[149333]: 2025-10-08 19:13:05.056258714 +0000 UTC m=+0.467065137 container remove bb9f09cd0eed6235cc64b81363d45392abfcd27382ba618570c50d74721f0177 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 19:13:05 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:05.065 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[c7716e01-bc45-424d-9f28-89070d555139]: (4, ('Wed Oct  8 07:13:04 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48 (bb9f09cd0eed6235cc64b81363d45392abfcd27382ba618570c50d74721f0177)\nbb9f09cd0eed6235cc64b81363d45392abfcd27382ba618570c50d74721f0177\nWed Oct  8 07:13:04 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48 (bb9f09cd0eed6235cc64b81363d45392abfcd27382ba618570c50d74721f0177)\nbb9f09cd0eed6235cc64b81363d45392abfcd27382ba618570c50d74721f0177\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:05 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:05.068 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[9f0ae40e-c206-49de-b27e-f4dacc222e56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:05 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:05.070 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9bec1de5-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:13:05 compute-0 nova_compute[117514]: 2025-10-08 19:13:05.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:05 compute-0 kernel: tap9bec1de5-80: left promiscuous mode
Oct  8 19:13:05 compute-0 nova_compute[117514]: 2025-10-08 19:13:05.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:05 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:05.094 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[27da0720-7416-4fb1-9008-346752c560d7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:05 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:05.129 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[7b5176bf-26d5-4145-ad21-bf7ef340f614]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:05 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:05.130 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[081d857e-60c5-4f5e-9d33-deefbc807eb4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:05 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:05.156 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[39aec415-f71d-4efd-9d86-7d5a8f1bcb37]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 145111, 'reachable_time': 28752, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 149349, 'error': None, 'target': 'ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:05 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:05.160 28783 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9bec1de5-8be3-4df6-b90a-943d76fedc48 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 19:13:05 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:05.160 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[45ccd5f6-7262-495c-b2db-6770589a065a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:05 compute-0 systemd[1]: run-netns-ovnmeta\x2d9bec1de5\x2d8be3\x2d4df6\x2db90a\x2d943d76fedc48.mount: Deactivated successfully.
Oct  8 19:13:05 compute-0 nova_compute[117514]: 2025-10-08 19:13:05.474 2 DEBUG nova.network.neutron [-] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:13:05 compute-0 nova_compute[117514]: 2025-10-08 19:13:05.492 2 INFO nova.compute.manager [-] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Took 1.22 seconds to deallocate network for instance.#033[00m
Oct  8 19:13:05 compute-0 nova_compute[117514]: 2025-10-08 19:13:05.534 2 DEBUG oslo_concurrency.lockutils [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:13:05 compute-0 nova_compute[117514]: 2025-10-08 19:13:05.535 2 DEBUG oslo_concurrency.lockutils [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:13:05 compute-0 nova_compute[117514]: 2025-10-08 19:13:05.601 2 DEBUG nova.compute.provider_tree [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:13:05 compute-0 nova_compute[117514]: 2025-10-08 19:13:05.622 2 DEBUG nova.scheduler.client.report [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:13:05 compute-0 nova_compute[117514]: 2025-10-08 19:13:05.646 2 DEBUG oslo_concurrency.lockutils [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:13:05 compute-0 nova_compute[117514]: 2025-10-08 19:13:05.677 2 INFO nova.scheduler.client.report [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Deleted allocations for instance 093d721c-61cb-4fd3-b678-7465d8840cc6#033[00m
Oct  8 19:13:05 compute-0 nova_compute[117514]: 2025-10-08 19:13:05.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:13:05 compute-0 nova_compute[117514]: 2025-10-08 19:13:05.748 2 DEBUG oslo_concurrency.lockutils [None req-3c44e3a1-ee25-4bd5-9884-5c405b1d40a3 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "093d721c-61cb-4fd3-b678-7465d8840cc6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.860s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:13:06 compute-0 nova_compute[117514]: 2025-10-08 19:13:06.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:07 compute-0 nova_compute[117514]: 2025-10-08 19:13:07.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:13:07 compute-0 nova_compute[117514]: 2025-10-08 19:13:07.739 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:13:07 compute-0 nova_compute[117514]: 2025-10-08 19:13:07.739 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:13:07 compute-0 nova_compute[117514]: 2025-10-08 19:13:07.740 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:13:07 compute-0 nova_compute[117514]: 2025-10-08 19:13:07.740 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 19:13:07 compute-0 podman[149351]: 2025-10-08 19:13:07.877843118 +0000 UTC m=+0.085742134 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 19:13:07 compute-0 podman[149353]: 2025-10-08 19:13:07.898410582 +0000 UTC m=+0.087305850 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  8 19:13:07 compute-0 podman[149352]: 2025-10-08 19:13:07.932828065 +0000 UTC m=+0.126308995 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct  8 19:13:07 compute-0 nova_compute[117514]: 2025-10-08 19:13:07.993 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 19:13:07 compute-0 nova_compute[117514]: 2025-10-08 19:13:07.996 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6069MB free_disk=73.41393280029297GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 19:13:07 compute-0 nova_compute[117514]: 2025-10-08 19:13:07.996 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:13:07 compute-0 nova_compute[117514]: 2025-10-08 19:13:07.997 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:13:08 compute-0 nova_compute[117514]: 2025-10-08 19:13:08.061 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 19:13:08 compute-0 nova_compute[117514]: 2025-10-08 19:13:08.061 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 19:13:08 compute-0 nova_compute[117514]: 2025-10-08 19:13:08.084 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:13:08 compute-0 nova_compute[117514]: 2025-10-08 19:13:08.098 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:13:08 compute-0 nova_compute[117514]: 2025-10-08 19:13:08.120 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 19:13:08 compute-0 nova_compute[117514]: 2025-10-08 19:13:08.121 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:13:09 compute-0 nova_compute[117514]: 2025-10-08 19:13:09.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:10 compute-0 nova_compute[117514]: 2025-10-08 19:13:10.116 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:13:10 compute-0 nova_compute[117514]: 2025-10-08 19:13:10.118 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:13:10 compute-0 nova_compute[117514]: 2025-10-08 19:13:10.118 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 19:13:10 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:10.431 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f81f7a-64d8-418a-a74c-b879bd6deb83, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:13:10 compute-0 nova_compute[117514]: 2025-10-08 19:13:10.718 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:13:10 compute-0 nova_compute[117514]: 2025-10-08 19:13:10.719 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 19:13:10 compute-0 nova_compute[117514]: 2025-10-08 19:13:10.719 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 19:13:10 compute-0 nova_compute[117514]: 2025-10-08 19:13:10.745 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 19:13:11 compute-0 nova_compute[117514]: 2025-10-08 19:13:11.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:11 compute-0 nova_compute[117514]: 2025-10-08 19:13:11.583 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:11 compute-0 nova_compute[117514]: 2025-10-08 19:13:11.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:11 compute-0 nova_compute[117514]: 2025-10-08 19:13:11.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:13:11 compute-0 nova_compute[117514]: 2025-10-08 19:13:11.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:13:12 compute-0 nova_compute[117514]: 2025-10-08 19:13:12.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:13:13 compute-0 nova_compute[117514]: 2025-10-08 19:13:13.714 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:13:13 compute-0 nova_compute[117514]: 2025-10-08 19:13:13.735 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:13:14 compute-0 nova_compute[117514]: 2025-10-08 19:13:14.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:16 compute-0 nova_compute[117514]: 2025-10-08 19:13:16.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:17 compute-0 podman[149414]: 2025-10-08 19:13:17.626665618 +0000 UTC m=+0.050586070 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 19:13:19 compute-0 nova_compute[117514]: 2025-10-08 19:13:19.167 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759950784.165181, 093d721c-61cb-4fd3-b678-7465d8840cc6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:13:19 compute-0 nova_compute[117514]: 2025-10-08 19:13:19.167 2 INFO nova.compute.manager [-] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] VM Stopped (Lifecycle Event)#033[00m
Oct  8 19:13:19 compute-0 nova_compute[117514]: 2025-10-08 19:13:19.185 2 DEBUG nova.compute.manager [None req-d26191a6-8b0b-4c65-8c32-a1f4b8dfc372 - - - - - -] [instance: 093d721c-61cb-4fd3-b678-7465d8840cc6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:13:19 compute-0 nova_compute[117514]: 2025-10-08 19:13:19.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:21 compute-0 nova_compute[117514]: 2025-10-08 19:13:21.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:24 compute-0 nova_compute[117514]: 2025-10-08 19:13:24.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:26 compute-0 nova_compute[117514]: 2025-10-08 19:13:26.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:26 compute-0 nova_compute[117514]: 2025-10-08 19:13:26.501 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "f6235bae-08b8-41c2-a187-92e12703dc49" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:13:26 compute-0 nova_compute[117514]: 2025-10-08 19:13:26.501 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "f6235bae-08b8-41c2-a187-92e12703dc49" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:13:26 compute-0 nova_compute[117514]: 2025-10-08 19:13:26.521 2 DEBUG nova.compute.manager [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 19:13:26 compute-0 nova_compute[117514]: 2025-10-08 19:13:26.735 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:13:26 compute-0 nova_compute[117514]: 2025-10-08 19:13:26.736 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:13:26 compute-0 nova_compute[117514]: 2025-10-08 19:13:26.744 2 DEBUG nova.virt.hardware [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 19:13:26 compute-0 nova_compute[117514]: 2025-10-08 19:13:26.744 2 INFO nova.compute.claims [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Claim successful on node compute-0.ctlplane.example.com#033[00m
Oct  8 19:13:26 compute-0 nova_compute[117514]: 2025-10-08 19:13:26.848 2 DEBUG nova.compute.provider_tree [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:13:26 compute-0 nova_compute[117514]: 2025-10-08 19:13:26.863 2 DEBUG nova.scheduler.client.report [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:13:26 compute-0 nova_compute[117514]: 2025-10-08 19:13:26.891 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:13:26 compute-0 nova_compute[117514]: 2025-10-08 19:13:26.892 2 DEBUG nova.compute.manager [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 19:13:26 compute-0 nova_compute[117514]: 2025-10-08 19:13:26.946 2 DEBUG nova.compute.manager [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 19:13:26 compute-0 nova_compute[117514]: 2025-10-08 19:13:26.947 2 DEBUG nova.network.neutron [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 19:13:26 compute-0 nova_compute[117514]: 2025-10-08 19:13:26.968 2 INFO nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 19:13:26 compute-0 nova_compute[117514]: 2025-10-08 19:13:26.994 2 DEBUG nova.compute.manager [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.096 2 DEBUG nova.policy [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.110 2 DEBUG nova.compute.manager [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.112 2 DEBUG nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.113 2 INFO nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Creating image(s)#033[00m
Oct  8 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.114 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "/var/lib/nova/instances/f6235bae-08b8-41c2-a187-92e12703dc49/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.114 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/f6235bae-08b8-41c2-a187-92e12703dc49/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.116 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/f6235bae-08b8-41c2-a187-92e12703dc49/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.143 2 DEBUG oslo_concurrency.processutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.229 2 DEBUG oslo_concurrency.processutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.231 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "008eb3078b811ee47058b7252a820910c35fc6df" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.232 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.256 2 DEBUG oslo_concurrency.processutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.348 2 DEBUG oslo_concurrency.processutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.350 2 DEBUG oslo_concurrency.processutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/f6235bae-08b8-41c2-a187-92e12703dc49/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:13:27 compute-0 podman[149449]: 2025-10-08 19:13:27.674553905 +0000 UTC m=+0.086185517 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible)
Oct  8 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.685 2 DEBUG oslo_concurrency.processutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/f6235bae-08b8-41c2-a187-92e12703dc49/disk 1073741824" returned: 0 in 0.335s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.687 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.455s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.688 2 DEBUG oslo_concurrency.processutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.774 2 DEBUG oslo_concurrency.processutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.775 2 DEBUG nova.virt.disk.api [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Checking if we can resize image /var/lib/nova/instances/f6235bae-08b8-41c2-a187-92e12703dc49/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  8 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.776 2 DEBUG oslo_concurrency.processutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f6235bae-08b8-41c2-a187-92e12703dc49/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.841 2 DEBUG oslo_concurrency.processutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f6235bae-08b8-41c2-a187-92e12703dc49/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.843 2 DEBUG nova.virt.disk.api [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Cannot resize image /var/lib/nova/instances/f6235bae-08b8-41c2-a187-92e12703dc49/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  8 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.844 2 DEBUG nova.objects.instance [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'migration_context' on Instance uuid f6235bae-08b8-41c2-a187-92e12703dc49 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.863 2 DEBUG nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.864 2 DEBUG nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Ensure instance console log exists: /var/lib/nova/instances/f6235bae-08b8-41c2-a187-92e12703dc49/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.864 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.865 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.865 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:13:27 compute-0 nova_compute[117514]: 2025-10-08 19:13:27.893 2 DEBUG nova.network.neutron [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Successfully created port: b5212a27-711c-427f-af17-227f961acc42 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 19:13:29 compute-0 nova_compute[117514]: 2025-10-08 19:13:29.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:29 compute-0 nova_compute[117514]: 2025-10-08 19:13:29.478 2 DEBUG nova.network.neutron [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Successfully updated port: b5212a27-711c-427f-af17-227f961acc42 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 19:13:29 compute-0 nova_compute[117514]: 2025-10-08 19:13:29.490 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "refresh_cache-f6235bae-08b8-41c2-a187-92e12703dc49" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:13:29 compute-0 nova_compute[117514]: 2025-10-08 19:13:29.491 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquired lock "refresh_cache-f6235bae-08b8-41c2-a187-92e12703dc49" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:13:29 compute-0 nova_compute[117514]: 2025-10-08 19:13:29.491 2 DEBUG nova.network.neutron [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 19:13:29 compute-0 nova_compute[117514]: 2025-10-08 19:13:29.574 2 DEBUG nova.compute.manager [req-4a0d852f-e4d1-4835-ad03-70c9bbec9e3c req-7f27874f-df74-4e3a-b56f-907055577372 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Received event network-changed-b5212a27-711c-427f-af17-227f961acc42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:13:29 compute-0 nova_compute[117514]: 2025-10-08 19:13:29.574 2 DEBUG nova.compute.manager [req-4a0d852f-e4d1-4835-ad03-70c9bbec9e3c req-7f27874f-df74-4e3a-b56f-907055577372 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Refreshing instance network info cache due to event network-changed-b5212a27-711c-427f-af17-227f961acc42. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 19:13:29 compute-0 nova_compute[117514]: 2025-10-08 19:13:29.575 2 DEBUG oslo_concurrency.lockutils [req-4a0d852f-e4d1-4835-ad03-70c9bbec9e3c req-7f27874f-df74-4e3a-b56f-907055577372 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-f6235bae-08b8-41c2-a187-92e12703dc49" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:13:30 compute-0 nova_compute[117514]: 2025-10-08 19:13:30.158 2 DEBUG nova.network.neutron [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.230 2 DEBUG nova.network.neutron [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Updating instance_info_cache with network_info: [{"id": "b5212a27-711c-427f-af17-227f961acc42", "address": "fa:16:3e:82:d1:87", "network": {"id": "316ecc22-916e-4a30-bb08-c6bd94993bb1", "bridge": "br-int", "label": "tempest-network-smoke--1739360415", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5212a27-71", "ovs_interfaceid": "b5212a27-711c-427f-af17-227f961acc42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.293 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Releasing lock "refresh_cache-f6235bae-08b8-41c2-a187-92e12703dc49" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.293 2 DEBUG nova.compute.manager [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Instance network_info: |[{"id": "b5212a27-711c-427f-af17-227f961acc42", "address": "fa:16:3e:82:d1:87", "network": {"id": "316ecc22-916e-4a30-bb08-c6bd94993bb1", "bridge": "br-int", "label": "tempest-network-smoke--1739360415", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5212a27-71", "ovs_interfaceid": "b5212a27-711c-427f-af17-227f961acc42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.294 2 DEBUG oslo_concurrency.lockutils [req-4a0d852f-e4d1-4835-ad03-70c9bbec9e3c req-7f27874f-df74-4e3a-b56f-907055577372 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-f6235bae-08b8-41c2-a187-92e12703dc49" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.294 2 DEBUG nova.network.neutron [req-4a0d852f-e4d1-4835-ad03-70c9bbec9e3c req-7f27874f-df74-4e3a-b56f-907055577372 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Refreshing network info cache for port b5212a27-711c-427f-af17-227f961acc42 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.298 2 DEBUG nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Start _get_guest_xml network_info=[{"id": "b5212a27-711c-427f-af17-227f961acc42", "address": "fa:16:3e:82:d1:87", "network": {"id": "316ecc22-916e-4a30-bb08-c6bd94993bb1", "bridge": "br-int", "label": "tempest-network-smoke--1739360415", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5212a27-71", "ovs_interfaceid": "b5212a27-711c-427f-af17-227f961acc42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'image_id': '23cfa426-7011-4566-992d-1c7af39f70dd'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.305 2 WARNING nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.319 2 DEBUG nova.virt.libvirt.host [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.320 2 DEBUG nova.virt.libvirt.host [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.324 2 DEBUG nova.virt.libvirt.host [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.325 2 DEBUG nova.virt.libvirt.host [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.326 2 DEBUG nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.326 2 DEBUG nova.virt.hardware [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T19:05:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e8a148fc-4419-4813-98ff-a17e2a95609e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.327 2 DEBUG nova.virt.hardware [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.327 2 DEBUG nova.virt.hardware [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.327 2 DEBUG nova.virt.hardware [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.327 2 DEBUG nova.virt.hardware [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.328 2 DEBUG nova.virt.hardware [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.328 2 DEBUG nova.virt.hardware [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.328 2 DEBUG nova.virt.hardware [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.328 2 DEBUG nova.virt.hardware [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.328 2 DEBUG nova.virt.hardware [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.329 2 DEBUG nova.virt.hardware [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.333 2 DEBUG nova.virt.libvirt.vif [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:13:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-240293585',display_name='tempest-TestNetworkBasicOps-server-240293585',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-240293585',id=10,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB/EJhQ2cVwT1bBhqwqz8VJCILEiuVe01OpwaJWr7LJzSA4TSCURQ/KKnNYCEn/1h4DXNQh6VFPnJP6UNtvndekIhyamyZMFdOa7ELSKKJb75n9Ge1ikETCgbfRbvFVTqw==',key_name='tempest-TestNetworkBasicOps-1252364033',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-3qhae7ry',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:13:27Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=f6235bae-08b8-41c2-a187-92e12703dc49,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b5212a27-711c-427f-af17-227f961acc42", "address": "fa:16:3e:82:d1:87", "network": {"id": "316ecc22-916e-4a30-bb08-c6bd94993bb1", "bridge": "br-int", "label": "tempest-network-smoke--1739360415", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5212a27-71", "ovs_interfaceid": "b5212a27-711c-427f-af17-227f961acc42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.334 2 DEBUG nova.network.os_vif_util [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "b5212a27-711c-427f-af17-227f961acc42", "address": "fa:16:3e:82:d1:87", "network": {"id": "316ecc22-916e-4a30-bb08-c6bd94993bb1", "bridge": "br-int", "label": "tempest-network-smoke--1739360415", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5212a27-71", "ovs_interfaceid": "b5212a27-711c-427f-af17-227f961acc42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.335 2 DEBUG nova.network.os_vif_util [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:d1:87,bridge_name='br-int',has_traffic_filtering=True,id=b5212a27-711c-427f-af17-227f961acc42,network=Network(316ecc22-916e-4a30-bb08-c6bd94993bb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5212a27-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.336 2 DEBUG nova.objects.instance [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'pci_devices' on Instance uuid f6235bae-08b8-41c2-a187-92e12703dc49 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.352 2 DEBUG nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] End _get_guest_xml xml=<domain type="kvm">
Oct  8 19:13:31 compute-0 nova_compute[117514]:  <uuid>f6235bae-08b8-41c2-a187-92e12703dc49</uuid>
Oct  8 19:13:31 compute-0 nova_compute[117514]:  <name>instance-0000000a</name>
Oct  8 19:13:31 compute-0 nova_compute[117514]:  <memory>131072</memory>
Oct  8 19:13:31 compute-0 nova_compute[117514]:  <vcpu>1</vcpu>
Oct  8 19:13:31 compute-0 nova_compute[117514]:  <metadata>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 19:13:31 compute-0 nova_compute[117514]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:      <nova:name>tempest-TestNetworkBasicOps-server-240293585</nova:name>
Oct  8 19:13:31 compute-0 nova_compute[117514]:      <nova:creationTime>2025-10-08 19:13:31</nova:creationTime>
Oct  8 19:13:31 compute-0 nova_compute[117514]:      <nova:flavor name="m1.nano">
Oct  8 19:13:31 compute-0 nova_compute[117514]:        <nova:memory>128</nova:memory>
Oct  8 19:13:31 compute-0 nova_compute[117514]:        <nova:disk>1</nova:disk>
Oct  8 19:13:31 compute-0 nova_compute[117514]:        <nova:swap>0</nova:swap>
Oct  8 19:13:31 compute-0 nova_compute[117514]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 19:13:31 compute-0 nova_compute[117514]:        <nova:vcpus>1</nova:vcpus>
Oct  8 19:13:31 compute-0 nova_compute[117514]:      </nova:flavor>
Oct  8 19:13:31 compute-0 nova_compute[117514]:      <nova:owner>
Oct  8 19:13:31 compute-0 nova_compute[117514]:        <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct  8 19:13:31 compute-0 nova_compute[117514]:        <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct  8 19:13:31 compute-0 nova_compute[117514]:      </nova:owner>
Oct  8 19:13:31 compute-0 nova_compute[117514]:      <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:      <nova:ports>
Oct  8 19:13:31 compute-0 nova_compute[117514]:        <nova:port uuid="b5212a27-711c-427f-af17-227f961acc42">
Oct  8 19:13:31 compute-0 nova_compute[117514]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:        </nova:port>
Oct  8 19:13:31 compute-0 nova_compute[117514]:      </nova:ports>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    </nova:instance>
Oct  8 19:13:31 compute-0 nova_compute[117514]:  </metadata>
Oct  8 19:13:31 compute-0 nova_compute[117514]:  <sysinfo type="smbios">
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <system>
Oct  8 19:13:31 compute-0 nova_compute[117514]:      <entry name="manufacturer">RDO</entry>
Oct  8 19:13:31 compute-0 nova_compute[117514]:      <entry name="product">OpenStack Compute</entry>
Oct  8 19:13:31 compute-0 nova_compute[117514]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 19:13:31 compute-0 nova_compute[117514]:      <entry name="serial">f6235bae-08b8-41c2-a187-92e12703dc49</entry>
Oct  8 19:13:31 compute-0 nova_compute[117514]:      <entry name="uuid">f6235bae-08b8-41c2-a187-92e12703dc49</entry>
Oct  8 19:13:31 compute-0 nova_compute[117514]:      <entry name="family">Virtual Machine</entry>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    </system>
Oct  8 19:13:31 compute-0 nova_compute[117514]:  </sysinfo>
Oct  8 19:13:31 compute-0 nova_compute[117514]:  <os>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <boot dev="hd"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <smbios mode="sysinfo"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:  </os>
Oct  8 19:13:31 compute-0 nova_compute[117514]:  <features>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <acpi/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <apic/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <vmcoreinfo/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:  </features>
Oct  8 19:13:31 compute-0 nova_compute[117514]:  <clock offset="utc">
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <timer name="hpet" present="no"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:  </clock>
Oct  8 19:13:31 compute-0 nova_compute[117514]:  <cpu mode="host-model" match="exact">
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:  </cpu>
Oct  8 19:13:31 compute-0 nova_compute[117514]:  <devices>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <disk type="file" device="disk">
Oct  8 19:13:31 compute-0 nova_compute[117514]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:      <source file="/var/lib/nova/instances/f6235bae-08b8-41c2-a187-92e12703dc49/disk"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:      <target dev="vda" bus="virtio"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <disk type="file" device="cdrom">
Oct  8 19:13:31 compute-0 nova_compute[117514]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:      <source file="/var/lib/nova/instances/f6235bae-08b8-41c2-a187-92e12703dc49/disk.config"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:      <target dev="sda" bus="sata"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <interface type="ethernet">
Oct  8 19:13:31 compute-0 nova_compute[117514]:      <mac address="fa:16:3e:82:d1:87"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:      <model type="virtio"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:      <mtu size="1442"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:      <target dev="tapb5212a27-71"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    </interface>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <serial type="pty">
Oct  8 19:13:31 compute-0 nova_compute[117514]:      <log file="/var/lib/nova/instances/f6235bae-08b8-41c2-a187-92e12703dc49/console.log" append="off"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    </serial>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <video>
Oct  8 19:13:31 compute-0 nova_compute[117514]:      <model type="virtio"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    </video>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <input type="tablet" bus="usb"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <rng model="virtio">
Oct  8 19:13:31 compute-0 nova_compute[117514]:      <backend model="random">/dev/urandom</backend>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    </rng>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <controller type="usb" index="0"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    <memballoon model="virtio">
Oct  8 19:13:31 compute-0 nova_compute[117514]:      <stats period="10"/>
Oct  8 19:13:31 compute-0 nova_compute[117514]:    </memballoon>
Oct  8 19:13:31 compute-0 nova_compute[117514]:  </devices>
Oct  8 19:13:31 compute-0 nova_compute[117514]: </domain>
Oct  8 19:13:31 compute-0 nova_compute[117514]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.353 2 DEBUG nova.compute.manager [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Preparing to wait for external event network-vif-plugged-b5212a27-711c-427f-af17-227f961acc42 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.354 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "f6235bae-08b8-41c2-a187-92e12703dc49-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.354 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "f6235bae-08b8-41c2-a187-92e12703dc49-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.354 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "f6235bae-08b8-41c2-a187-92e12703dc49-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.355 2 DEBUG nova.virt.libvirt.vif [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:13:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-240293585',display_name='tempest-TestNetworkBasicOps-server-240293585',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-240293585',id=10,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB/EJhQ2cVwT1bBhqwqz8VJCILEiuVe01OpwaJWr7LJzSA4TSCURQ/KKnNYCEn/1h4DXNQh6VFPnJP6UNtvndekIhyamyZMFdOa7ELSKKJb75n9Ge1ikETCgbfRbvFVTqw==',key_name='tempest-TestNetworkBasicOps-1252364033',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-3qhae7ry',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:13:27Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=f6235bae-08b8-41c2-a187-92e12703dc49,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b5212a27-711c-427f-af17-227f961acc42", "address": "fa:16:3e:82:d1:87", "network": {"id": "316ecc22-916e-4a30-bb08-c6bd94993bb1", "bridge": "br-int", "label": "tempest-network-smoke--1739360415", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5212a27-71", "ovs_interfaceid": "b5212a27-711c-427f-af17-227f961acc42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.355 2 DEBUG nova.network.os_vif_util [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "b5212a27-711c-427f-af17-227f961acc42", "address": "fa:16:3e:82:d1:87", "network": {"id": "316ecc22-916e-4a30-bb08-c6bd94993bb1", "bridge": "br-int", "label": "tempest-network-smoke--1739360415", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5212a27-71", "ovs_interfaceid": "b5212a27-711c-427f-af17-227f961acc42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.356 2 DEBUG nova.network.os_vif_util [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:82:d1:87,bridge_name='br-int',has_traffic_filtering=True,id=b5212a27-711c-427f-af17-227f961acc42,network=Network(316ecc22-916e-4a30-bb08-c6bd94993bb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5212a27-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.356 2 DEBUG os_vif [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:d1:87,bridge_name='br-int',has_traffic_filtering=True,id=b5212a27-711c-427f-af17-227f961acc42,network=Network(316ecc22-916e-4a30-bb08-c6bd94993bb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5212a27-71') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.357 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.358 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.363 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5212a27-71, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.364 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb5212a27-71, col_values=(('external_ids', {'iface-id': 'b5212a27-711c-427f-af17-227f961acc42', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:82:d1:87', 'vm-uuid': 'f6235bae-08b8-41c2-a187-92e12703dc49'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:31 compute-0 NetworkManager[1035]: <info>  [1759950811.3669] manager: (tapb5212a27-71): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/74)
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.382 2 INFO os_vif [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:82:d1:87,bridge_name='br-int',has_traffic_filtering=True,id=b5212a27-711c-427f-af17-227f961acc42,network=Network(316ecc22-916e-4a30-bb08-c6bd94993bb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5212a27-71')#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.445 2 DEBUG nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.445 2 DEBUG nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.445 2 DEBUG nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No VIF found with MAC fa:16:3e:82:d1:87, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.446 2 INFO nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Using config drive#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.698 2 INFO nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Creating config drive at /var/lib/nova/instances/f6235bae-08b8-41c2-a187-92e12703dc49/disk.config#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.707 2 DEBUG oslo_concurrency.processutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f6235bae-08b8-41c2-a187-92e12703dc49/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg4dzo948 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.847 2 DEBUG oslo_concurrency.processutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f6235bae-08b8-41c2-a187-92e12703dc49/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg4dzo948" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:13:31 compute-0 kernel: tapb5212a27-71: entered promiscuous mode
Oct  8 19:13:31 compute-0 NetworkManager[1035]: <info>  [1759950811.9362] manager: (tapb5212a27-71): new Tun device (/org/freedesktop/NetworkManager/Devices/75)
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:31 compute-0 ovn_controller[19759]: 2025-10-08T19:13:31Z|00127|binding|INFO|Claiming lport b5212a27-711c-427f-af17-227f961acc42 for this chassis.
Oct  8 19:13:31 compute-0 ovn_controller[19759]: 2025-10-08T19:13:31Z|00128|binding|INFO|b5212a27-711c-427f-af17-227f961acc42: Claiming fa:16:3e:82:d1:87 10.100.0.7
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:31 compute-0 nova_compute[117514]: 2025-10-08 19:13:31.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:31 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:31.960 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:d1:87 10.100.0.7'], port_security=['fa:16:3e:82:d1:87 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'f6235bae-08b8-41c2-a187-92e12703dc49', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-316ecc22-916e-4a30-bb08-c6bd94993bb1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2739dafe-af3c-4b39-8e6a-f28bb373aed0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=12b6fae3-2fc2-423c-bbb0-b3805950f5b3, chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=b5212a27-711c-427f-af17-227f961acc42) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:13:31 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:31.963 28643 INFO neutron.agent.ovn.metadata.agent [-] Port b5212a27-711c-427f-af17-227f961acc42 in datapath 316ecc22-916e-4a30-bb08-c6bd94993bb1 bound to our chassis#033[00m
Oct  8 19:13:31 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:31.964 28643 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 316ecc22-916e-4a30-bb08-c6bd94993bb1#033[00m
Oct  8 19:13:31 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:31.980 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[b5168e12-f605-451b-8d3c-9f0604717dce]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:31 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:31.981 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap316ecc22-91 in ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 19:13:31 compute-0 systemd-udevd[149495]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 19:13:31 compute-0 systemd-machined[77568]: New machine qemu-10-instance-0000000a.
Oct  8 19:13:31 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:31.984 144726 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap316ecc22-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 19:13:31 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:31.984 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[e7919a83-a6dc-4d6f-b9f9-daf6091dc6a7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:31 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:31.986 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[6d549303-7563-417a-898c-aa9333be2c95]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:31 compute-0 NetworkManager[1035]: <info>  [1759950811.9960] device (tapb5212a27-71): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 19:13:31 compute-0 NetworkManager[1035]: <info>  [1759950811.9969] device (tapb5212a27-71): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.006 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[28957b67-9abd-4753-806e-dc19d81204a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:32 compute-0 systemd[1]: Started Virtual Machine qemu-10-instance-0000000a.
Oct  8 19:13:32 compute-0 nova_compute[117514]: 2025-10-08 19:13:32.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:32 compute-0 ovn_controller[19759]: 2025-10-08T19:13:32Z|00129|binding|INFO|Setting lport b5212a27-711c-427f-af17-227f961acc42 ovn-installed in OVS
Oct  8 19:13:32 compute-0 ovn_controller[19759]: 2025-10-08T19:13:32Z|00130|binding|INFO|Setting lport b5212a27-711c-427f-af17-227f961acc42 up in Southbound
Oct  8 19:13:32 compute-0 nova_compute[117514]: 2025-10-08 19:13:32.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.024 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[3b1f1e7f-8761-4d8e-82ca-4e2f64f2c72b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.054 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[2e860b5b-2f9b-4546-8934-7651b47012b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.061 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[46844e62-bbdd-4545-920e-bedbe76dfafe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:32 compute-0 systemd-udevd[149498]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 19:13:32 compute-0 NetworkManager[1035]: <info>  [1759950812.0631] manager: (tap316ecc22-90): new Veth device (/org/freedesktop/NetworkManager/Devices/76)
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.103 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[9723aa1a-c796-43b5-b258-a5e39211c8ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.107 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[73711c02-85fb-435d-95f1-391570010264]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:32 compute-0 NetworkManager[1035]: <info>  [1759950812.1307] device (tap316ecc22-90): carrier: link connected
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.137 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[afba7f7e-b1ad-477e-8a7f-da178b4512fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.158 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[1754e981-c776-4939-9e67-8b82a84488d9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap316ecc22-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ef:18:46'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 148271, 'reachable_time': 43666, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 149527, 'error': None, 'target': 'ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.176 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[a3f6c09f-a36a-4930-b697-c9ae342d719b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feef:1846'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 148271, 'tstamp': 148271}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 149528, 'error': None, 'target': 'ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.195 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[26de5419-c475-4051-a069-f624905fc9fe]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap316ecc22-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ef:18:46'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 148271, 'reachable_time': 43666, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 149529, 'error': None, 'target': 'ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.233 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[87ba2427-1fec-476f-9184-a1dbdb20687b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.311 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[9b236615-146f-4831-976a-7d081af421b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.313 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap316ecc22-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.314 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.314 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap316ecc22-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:13:32 compute-0 kernel: tap316ecc22-90: entered promiscuous mode
Oct  8 19:13:32 compute-0 nova_compute[117514]: 2025-10-08 19:13:32.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:32 compute-0 NetworkManager[1035]: <info>  [1759950812.3178] manager: (tap316ecc22-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.320 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap316ecc22-90, col_values=(('external_ids', {'iface-id': '59904509-05c2-48c7-bbf8-0fca2b0d7dd8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:13:32 compute-0 ovn_controller[19759]: 2025-10-08T19:13:32Z|00131|binding|INFO|Releasing lport 59904509-05c2-48c7-bbf8-0fca2b0d7dd8 from this chassis (sb_readonly=0)
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.347 28643 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/316ecc22-916e-4a30-bb08-c6bd94993bb1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/316ecc22-916e-4a30-bb08-c6bd94993bb1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 19:13:32 compute-0 nova_compute[117514]: 2025-10-08 19:13:32.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.350 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[bb19937d-c438-47ab-914e-29a63cee0078]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.351 28643 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]: global
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]:    log         /dev/log local0 debug
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]:    log-tag     haproxy-metadata-proxy-316ecc22-916e-4a30-bb08-c6bd94993bb1
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]:    user        root
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]:    group       root
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]:    maxconn     1024
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]:    pidfile     /var/lib/neutron/external/pids/316ecc22-916e-4a30-bb08-c6bd94993bb1.pid.haproxy
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]:    daemon
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]: 
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]: defaults
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]:    log global
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]:    mode http
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]:    option httplog
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]:    option dontlognull
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]:    option http-server-close
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]:    option forwardfor
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]:    retries                 3
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]:    timeout http-request    30s
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]:    timeout connect         30s
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]:    timeout client          32s
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]:    timeout server          32s
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]:    timeout http-keep-alive 30s
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]: 
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]: 
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]: listen listener
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]:    bind 169.254.169.254:80
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]:    http-request add-header X-OVN-Network-ID 316ecc22-916e-4a30-bb08-c6bd94993bb1
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 19:13:32 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:32.351 28643 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1', 'env', 'PROCESS_TAG=haproxy-316ecc22-916e-4a30-bb08-c6bd94993bb1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/316ecc22-916e-4a30-bb08-c6bd94993bb1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 19:13:32 compute-0 nova_compute[117514]: 2025-10-08 19:13:32.419 2 DEBUG nova.compute.manager [req-4921b7ed-44da-47d1-906d-185a8d2e332d req-efce0ac2-058c-4cfb-9f3a-2791e7881bc7 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Received event network-vif-plugged-b5212a27-711c-427f-af17-227f961acc42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:13:32 compute-0 nova_compute[117514]: 2025-10-08 19:13:32.419 2 DEBUG oslo_concurrency.lockutils [req-4921b7ed-44da-47d1-906d-185a8d2e332d req-efce0ac2-058c-4cfb-9f3a-2791e7881bc7 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "f6235bae-08b8-41c2-a187-92e12703dc49-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:13:32 compute-0 nova_compute[117514]: 2025-10-08 19:13:32.419 2 DEBUG oslo_concurrency.lockutils [req-4921b7ed-44da-47d1-906d-185a8d2e332d req-efce0ac2-058c-4cfb-9f3a-2791e7881bc7 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "f6235bae-08b8-41c2-a187-92e12703dc49-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:13:32 compute-0 nova_compute[117514]: 2025-10-08 19:13:32.419 2 DEBUG oslo_concurrency.lockutils [req-4921b7ed-44da-47d1-906d-185a8d2e332d req-efce0ac2-058c-4cfb-9f3a-2791e7881bc7 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "f6235bae-08b8-41c2-a187-92e12703dc49-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:13:32 compute-0 nova_compute[117514]: 2025-10-08 19:13:32.420 2 DEBUG nova.compute.manager [req-4921b7ed-44da-47d1-906d-185a8d2e332d req-efce0ac2-058c-4cfb-9f3a-2791e7881bc7 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Processing event network-vif-plugged-b5212a27-711c-427f-af17-227f961acc42 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 19:13:32 compute-0 podman[149568]: 2025-10-08 19:13:32.741325445 +0000 UTC m=+0.023324414 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct  8 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.090 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950813.0901656, f6235bae-08b8-41c2-a187-92e12703dc49 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.091 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] VM Started (Lifecycle Event)#033[00m
Oct  8 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.095 2 DEBUG nova.compute.manager [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.100 2 DEBUG nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.104 2 INFO nova.virt.libvirt.driver [-] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Instance spawned successfully.#033[00m
Oct  8 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.104 2 DEBUG nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.124 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.127 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.141 2 DEBUG nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.142 2 DEBUG nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.142 2 DEBUG nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.143 2 DEBUG nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.144 2 DEBUG nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.144 2 DEBUG nova.virt.libvirt.driver [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.155 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.155 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950813.0903497, f6235bae-08b8-41c2-a187-92e12703dc49 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.156 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] VM Paused (Lifecycle Event)#033[00m
Oct  8 19:13:33 compute-0 podman[149568]: 2025-10-08 19:13:33.183319686 +0000 UTC m=+0.465318655 container create 194bb928d5f6ef3def4652af7e3fb08edd40a6cbd6283d5f12dfbb88c236f9cf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Oct  8 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.200 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.204 2 DEBUG nova.network.neutron [req-4a0d852f-e4d1-4835-ad03-70c9bbec9e3c req-7f27874f-df74-4e3a-b56f-907055577372 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Updated VIF entry in instance network info cache for port b5212a27-711c-427f-af17-227f961acc42. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.204 2 DEBUG nova.network.neutron [req-4a0d852f-e4d1-4835-ad03-70c9bbec9e3c req-7f27874f-df74-4e3a-b56f-907055577372 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Updating instance_info_cache with network_info: [{"id": "b5212a27-711c-427f-af17-227f961acc42", "address": "fa:16:3e:82:d1:87", "network": {"id": "316ecc22-916e-4a30-bb08-c6bd94993bb1", "bridge": "br-int", "label": "tempest-network-smoke--1739360415", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5212a27-71", "ovs_interfaceid": "b5212a27-711c-427f-af17-227f961acc42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.208 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950813.0992303, f6235bae-08b8-41c2-a187-92e12703dc49 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.209 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] VM Resumed (Lifecycle Event)#033[00m
Oct  8 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.217 2 INFO nova.compute.manager [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Took 6.11 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.218 2 DEBUG nova.compute.manager [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.225 2 DEBUG oslo_concurrency.lockutils [req-4a0d852f-e4d1-4835-ad03-70c9bbec9e3c req-7f27874f-df74-4e3a-b56f-907055577372 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-f6235bae-08b8-41c2-a187-92e12703dc49" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.234 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.245 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.272 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.301 2 INFO nova.compute.manager [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Took 6.72 seconds to build instance.#033[00m
Oct  8 19:13:33 compute-0 nova_compute[117514]: 2025-10-08 19:13:33.319 2 DEBUG oslo_concurrency.lockutils [None req-ae965c40-8360-48c5-8203-d8a64c9093c5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "f6235bae-08b8-41c2-a187-92e12703dc49" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.818s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:13:33 compute-0 systemd[1]: Started libpod-conmon-194bb928d5f6ef3def4652af7e3fb08edd40a6cbd6283d5f12dfbb88c236f9cf.scope.
Oct  8 19:13:33 compute-0 systemd[1]: Started libcrun container.
Oct  8 19:13:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f612388ad5483d09f475d7eb14eac62998c672b7cb73df4597040f3e56b0e25/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 19:13:33 compute-0 podman[149568]: 2025-10-08 19:13:33.67457525 +0000 UTC m=+0.956574259 container init 194bb928d5f6ef3def4652af7e3fb08edd40a6cbd6283d5f12dfbb88c236f9cf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  8 19:13:33 compute-0 podman[149568]: 2025-10-08 19:13:33.685304049 +0000 UTC m=+0.967303018 container start 194bb928d5f6ef3def4652af7e3fb08edd40a6cbd6283d5f12dfbb88c236f9cf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 19:13:33 compute-0 neutron-haproxy-ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1[149610]: [NOTICE]   (149628) : New worker (149630) forked
Oct  8 19:13:33 compute-0 neutron-haproxy-ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1[149610]: [NOTICE]   (149628) : Loading success.
Oct  8 19:13:33 compute-0 podman[149582]: 2025-10-08 19:13:33.96920367 +0000 UTC m=+0.725764050 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=multipathd)
Oct  8 19:13:33 compute-0 podman[149581]: 2025-10-08 19:13:33.987076816 +0000 UTC m=+0.754529460 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, architecture=x86_64, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  8 19:13:34 compute-0 nova_compute[117514]: 2025-10-08 19:13:34.511 2 DEBUG nova.compute.manager [req-e925bde7-f0a1-4b83-9c32-f037bfe0304e req-db73e1d3-abc2-448d-b80b-e45936085f43 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Received event network-vif-plugged-b5212a27-711c-427f-af17-227f961acc42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:13:34 compute-0 nova_compute[117514]: 2025-10-08 19:13:34.512 2 DEBUG oslo_concurrency.lockutils [req-e925bde7-f0a1-4b83-9c32-f037bfe0304e req-db73e1d3-abc2-448d-b80b-e45936085f43 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "f6235bae-08b8-41c2-a187-92e12703dc49-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:13:34 compute-0 nova_compute[117514]: 2025-10-08 19:13:34.512 2 DEBUG oslo_concurrency.lockutils [req-e925bde7-f0a1-4b83-9c32-f037bfe0304e req-db73e1d3-abc2-448d-b80b-e45936085f43 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "f6235bae-08b8-41c2-a187-92e12703dc49-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:13:34 compute-0 nova_compute[117514]: 2025-10-08 19:13:34.512 2 DEBUG oslo_concurrency.lockutils [req-e925bde7-f0a1-4b83-9c32-f037bfe0304e req-db73e1d3-abc2-448d-b80b-e45936085f43 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "f6235bae-08b8-41c2-a187-92e12703dc49-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:13:34 compute-0 nova_compute[117514]: 2025-10-08 19:13:34.513 2 DEBUG nova.compute.manager [req-e925bde7-f0a1-4b83-9c32-f037bfe0304e req-db73e1d3-abc2-448d-b80b-e45936085f43 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] No waiting events found dispatching network-vif-plugged-b5212a27-711c-427f-af17-227f961acc42 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:13:34 compute-0 nova_compute[117514]: 2025-10-08 19:13:34.513 2 WARNING nova.compute.manager [req-e925bde7-f0a1-4b83-9c32-f037bfe0304e req-db73e1d3-abc2-448d-b80b-e45936085f43 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Received unexpected event network-vif-plugged-b5212a27-711c-427f-af17-227f961acc42 for instance with vm_state active and task_state None.#033[00m
Oct  8 19:13:34 compute-0 podman[149640]: 2025-10-08 19:13:34.652658007 +0000 UTC m=+0.070591307 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 19:13:35 compute-0 nova_compute[117514]: 2025-10-08 19:13:35.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:35 compute-0 NetworkManager[1035]: <info>  [1759950815.7720] manager: (patch-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Oct  8 19:13:35 compute-0 NetworkManager[1035]: <info>  [1759950815.7739] manager: (patch-br-int-to-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/79)
Oct  8 19:13:35 compute-0 ovn_controller[19759]: 2025-10-08T19:13:35Z|00132|binding|INFO|Releasing lport 59904509-05c2-48c7-bbf8-0fca2b0d7dd8 from this chassis (sb_readonly=0)
Oct  8 19:13:35 compute-0 ovn_controller[19759]: 2025-10-08T19:13:35Z|00133|binding|INFO|Releasing lport 59904509-05c2-48c7-bbf8-0fca2b0d7dd8 from this chassis (sb_readonly=0)
Oct  8 19:13:35 compute-0 nova_compute[117514]: 2025-10-08 19:13:35.827 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:35 compute-0 nova_compute[117514]: 2025-10-08 19:13:35.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:36 compute-0 nova_compute[117514]: 2025-10-08 19:13:36.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:36 compute-0 nova_compute[117514]: 2025-10-08 19:13:36.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:36 compute-0 nova_compute[117514]: 2025-10-08 19:13:36.396 2 DEBUG nova.compute.manager [req-f3f3cb46-4492-4ec8-a3e9-77e64dd3c90c req-ec017432-9bbd-471b-9cca-71dcd9e5f3a1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Received event network-changed-b5212a27-711c-427f-af17-227f961acc42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:13:36 compute-0 nova_compute[117514]: 2025-10-08 19:13:36.397 2 DEBUG nova.compute.manager [req-f3f3cb46-4492-4ec8-a3e9-77e64dd3c90c req-ec017432-9bbd-471b-9cca-71dcd9e5f3a1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Refreshing instance network info cache due to event network-changed-b5212a27-711c-427f-af17-227f961acc42. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 19:13:36 compute-0 nova_compute[117514]: 2025-10-08 19:13:36.397 2 DEBUG oslo_concurrency.lockutils [req-f3f3cb46-4492-4ec8-a3e9-77e64dd3c90c req-ec017432-9bbd-471b-9cca-71dcd9e5f3a1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-f6235bae-08b8-41c2-a187-92e12703dc49" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:13:36 compute-0 nova_compute[117514]: 2025-10-08 19:13:36.397 2 DEBUG oslo_concurrency.lockutils [req-f3f3cb46-4492-4ec8-a3e9-77e64dd3c90c req-ec017432-9bbd-471b-9cca-71dcd9e5f3a1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-f6235bae-08b8-41c2-a187-92e12703dc49" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:13:36 compute-0 nova_compute[117514]: 2025-10-08 19:13:36.397 2 DEBUG nova.network.neutron [req-f3f3cb46-4492-4ec8-a3e9-77e64dd3c90c req-ec017432-9bbd-471b-9cca-71dcd9e5f3a1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Refreshing network info cache for port b5212a27-711c-427f-af17-227f961acc42 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 19:13:37 compute-0 nova_compute[117514]: 2025-10-08 19:13:37.283 2 DEBUG nova.network.neutron [req-f3f3cb46-4492-4ec8-a3e9-77e64dd3c90c req-ec017432-9bbd-471b-9cca-71dcd9e5f3a1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Updated VIF entry in instance network info cache for port b5212a27-711c-427f-af17-227f961acc42. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 19:13:37 compute-0 nova_compute[117514]: 2025-10-08 19:13:37.284 2 DEBUG nova.network.neutron [req-f3f3cb46-4492-4ec8-a3e9-77e64dd3c90c req-ec017432-9bbd-471b-9cca-71dcd9e5f3a1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Updating instance_info_cache with network_info: [{"id": "b5212a27-711c-427f-af17-227f961acc42", "address": "fa:16:3e:82:d1:87", "network": {"id": "316ecc22-916e-4a30-bb08-c6bd94993bb1", "bridge": "br-int", "label": "tempest-network-smoke--1739360415", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5212a27-71", "ovs_interfaceid": "b5212a27-711c-427f-af17-227f961acc42", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:13:37 compute-0 nova_compute[117514]: 2025-10-08 19:13:37.312 2 DEBUG oslo_concurrency.lockutils [req-f3f3cb46-4492-4ec8-a3e9-77e64dd3c90c req-ec017432-9bbd-471b-9cca-71dcd9e5f3a1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-f6235bae-08b8-41c2-a187-92e12703dc49" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:13:38 compute-0 podman[149668]: 2025-10-08 19:13:38.653602867 +0000 UTC m=+0.060019143 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent)
Oct  8 19:13:38 compute-0 podman[149666]: 2025-10-08 19:13:38.685039514 +0000 UTC m=+0.098867733 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 19:13:38 compute-0 podman[149667]: 2025-10-08 19:13:38.717916822 +0000 UTC m=+0.130565327 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 19:13:41 compute-0 nova_compute[117514]: 2025-10-08 19:13:41.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:41 compute-0 nova_compute[117514]: 2025-10-08 19:13:41.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:44.233 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:13:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:44.234 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:13:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:13:44.235 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:13:46 compute-0 nova_compute[117514]: 2025-10-08 19:13:46.253 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:46 compute-0 nova_compute[117514]: 2025-10-08 19:13:46.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:47 compute-0 ovn_controller[19759]: 2025-10-08T19:13:47Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:82:d1:87 10.100.0.7
Oct  8 19:13:47 compute-0 ovn_controller[19759]: 2025-10-08T19:13:47Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:82:d1:87 10.100.0.7
Oct  8 19:13:48 compute-0 podman[149745]: 2025-10-08 19:13:48.656670661 +0000 UTC m=+0.065932322 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 19:13:51 compute-0 nova_compute[117514]: 2025-10-08 19:13:51.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:51 compute-0 nova_compute[117514]: 2025-10-08 19:13:51.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:54 compute-0 nova_compute[117514]: 2025-10-08 19:13:54.888 2 INFO nova.compute.manager [None req-aad24999-986b-479b-b17d-e0ea21edde11 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Get console output#033[00m
Oct  8 19:13:54 compute-0 nova_compute[117514]: 2025-10-08 19:13:54.894 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 19:13:56 compute-0 nova_compute[117514]: 2025-10-08 19:13:56.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:56 compute-0 nova_compute[117514]: 2025-10-08 19:13:56.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:13:56 compute-0 ovn_controller[19759]: 2025-10-08T19:13:56Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:82:d1:87 10.100.0.7
Oct  8 19:13:57 compute-0 ovn_controller[19759]: 2025-10-08T19:13:57Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:82:d1:87 10.100.0.7
Oct  8 19:13:58 compute-0 podman[149769]: 2025-10-08 19:13:58.666171822 +0000 UTC m=+0.080726380 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  8 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.349 2 DEBUG nova.compute.manager [req-cdbb648e-890e-421b-9045-dc67e216ff53 req-d556c957-4c99-4c3d-a99f-5e8773a72029 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Received event network-changed-b5212a27-711c-427f-af17-227f961acc42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.349 2 DEBUG nova.compute.manager [req-cdbb648e-890e-421b-9045-dc67e216ff53 req-d556c957-4c99-4c3d-a99f-5e8773a72029 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Refreshing instance network info cache due to event network-changed-b5212a27-711c-427f-af17-227f961acc42. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.349 2 DEBUG oslo_concurrency.lockutils [req-cdbb648e-890e-421b-9045-dc67e216ff53 req-d556c957-4c99-4c3d-a99f-5e8773a72029 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-f6235bae-08b8-41c2-a187-92e12703dc49" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.350 2 DEBUG oslo_concurrency.lockutils [req-cdbb648e-890e-421b-9045-dc67e216ff53 req-d556c957-4c99-4c3d-a99f-5e8773a72029 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-f6235bae-08b8-41c2-a187-92e12703dc49" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.350 2 DEBUG nova.network.neutron [req-cdbb648e-890e-421b-9045-dc67e216ff53 req-d556c957-4c99-4c3d-a99f-5e8773a72029 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Refreshing network info cache for port b5212a27-711c-427f-af17-227f961acc42 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.433 2 DEBUG oslo_concurrency.lockutils [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "f6235bae-08b8-41c2-a187-92e12703dc49" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.434 2 DEBUG oslo_concurrency.lockutils [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "f6235bae-08b8-41c2-a187-92e12703dc49" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.434 2 DEBUG oslo_concurrency.lockutils [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "f6235bae-08b8-41c2-a187-92e12703dc49-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.434 2 DEBUG oslo_concurrency.lockutils [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "f6235bae-08b8-41c2-a187-92e12703dc49-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.434 2 DEBUG oslo_concurrency.lockutils [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "f6235bae-08b8-41c2-a187-92e12703dc49-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.435 2 INFO nova.compute.manager [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Terminating instance#033[00m
Oct  8 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.436 2 DEBUG nova.compute.manager [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 19:14:00 compute-0 kernel: tapb5212a27-71 (unregistering): left promiscuous mode
Oct  8 19:14:00 compute-0 NetworkManager[1035]: <info>  [1759950840.4686] device (tapb5212a27-71): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 19:14:00 compute-0 ovn_controller[19759]: 2025-10-08T19:14:00Z|00134|binding|INFO|Releasing lport b5212a27-711c-427f-af17-227f961acc42 from this chassis (sb_readonly=0)
Oct  8 19:14:00 compute-0 ovn_controller[19759]: 2025-10-08T19:14:00Z|00135|binding|INFO|Setting lport b5212a27-711c-427f-af17-227f961acc42 down in Southbound
Oct  8 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:00 compute-0 ovn_controller[19759]: 2025-10-08T19:14:00Z|00136|binding|INFO|Removing iface tapb5212a27-71 ovn-installed in OVS
Oct  8 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:00.493 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:d1:87 10.100.0.7'], port_security=['fa:16:3e:82:d1:87 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'f6235bae-08b8-41c2-a187-92e12703dc49', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-316ecc22-916e-4a30-bb08-c6bd94993bb1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2739dafe-af3c-4b39-8e6a-f28bb373aed0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=12b6fae3-2fc2-423c-bbb0-b3805950f5b3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=b5212a27-711c-427f-af17-227f961acc42) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:14:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:00.495 28643 INFO neutron.agent.ovn.metadata.agent [-] Port b5212a27-711c-427f-af17-227f961acc42 in datapath 316ecc22-916e-4a30-bb08-c6bd94993bb1 unbound from our chassis#033[00m
Oct  8 19:14:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:00.497 28643 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 316ecc22-916e-4a30-bb08-c6bd94993bb1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 19:14:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:00.499 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[1ee279e4-535c-4ba3-bd05-a367027b5b34]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:14:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:00.500 28643 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1 namespace which is not needed anymore#033[00m
Oct  8 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:00 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Oct  8 19:14:00 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000a.scope: Consumed 14.019s CPU time.
Oct  8 19:14:00 compute-0 systemd-machined[77568]: Machine qemu-10-instance-0000000a terminated.
Oct  8 19:14:00 compute-0 kernel: tapb5212a27-71: entered promiscuous mode
Oct  8 19:14:00 compute-0 kernel: tapb5212a27-71 (unregistering): left promiscuous mode
Oct  8 19:14:00 compute-0 NetworkManager[1035]: <info>  [1759950840.6705] manager: (tapb5212a27-71): new Tun device (/org/freedesktop/NetworkManager/Devices/80)
Oct  8 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:00 compute-0 neutron-haproxy-ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1[149610]: [NOTICE]   (149628) : haproxy version is 2.8.14-c23fe91
Oct  8 19:14:00 compute-0 neutron-haproxy-ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1[149610]: [NOTICE]   (149628) : path to executable is /usr/sbin/haproxy
Oct  8 19:14:00 compute-0 neutron-haproxy-ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1[149610]: [WARNING]  (149628) : Exiting Master process...
Oct  8 19:14:00 compute-0 neutron-haproxy-ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1[149610]: [ALERT]    (149628) : Current worker (149630) exited with code 143 (Terminated)
Oct  8 19:14:00 compute-0 neutron-haproxy-ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1[149610]: [WARNING]  (149628) : All workers exited. Exiting... (0)
Oct  8 19:14:00 compute-0 systemd[1]: libpod-194bb928d5f6ef3def4652af7e3fb08edd40a6cbd6283d5f12dfbb88c236f9cf.scope: Deactivated successfully.
Oct  8 19:14:00 compute-0 podman[149813]: 2025-10-08 19:14:00.720128451 +0000 UTC m=+0.098053962 container died 194bb928d5f6ef3def4652af7e3fb08edd40a6cbd6283d5f12dfbb88c236f9cf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  8 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.737 2 INFO nova.virt.libvirt.driver [-] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Instance destroyed successfully.#033[00m
Oct  8 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.738 2 DEBUG nova.objects.instance [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'resources' on Instance uuid f6235bae-08b8-41c2-a187-92e12703dc49 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:14:00 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-194bb928d5f6ef3def4652af7e3fb08edd40a6cbd6283d5f12dfbb88c236f9cf-userdata-shm.mount: Deactivated successfully.
Oct  8 19:14:00 compute-0 systemd[1]: var-lib-containers-storage-overlay-3f612388ad5483d09f475d7eb14eac62998c672b7cb73df4597040f3e56b0e25-merged.mount: Deactivated successfully.
Oct  8 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.759 2 DEBUG nova.virt.libvirt.vif [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:13:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-240293585',display_name='tempest-TestNetworkBasicOps-server-240293585',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-240293585',id=10,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB/EJhQ2cVwT1bBhqwqz8VJCILEiuVe01OpwaJWr7LJzSA4TSCURQ/KKnNYCEn/1h4DXNQh6VFPnJP6UNtvndekIhyamyZMFdOa7ELSKKJb75n9Ge1ikETCgbfRbvFVTqw==',key_name='tempest-TestNetworkBasicOps-1252364033',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:13:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-3qhae7ry',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:13:33Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=f6235bae-08b8-41c2-a187-92e12703dc49,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b5212a27-711c-427f-af17-227f961acc42", "address": "fa:16:3e:82:d1:87", "network": {"id": "316ecc22-916e-4a30-bb08-c6bd94993bb1", "bridge": "br-int", "label": "tempest-network-smoke--1739360415", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": 
{}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5212a27-71", "ovs_interfaceid": "b5212a27-711c-427f-af17-227f961acc42", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.759 2 DEBUG nova.network.os_vif_util [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "b5212a27-711c-427f-af17-227f961acc42", "address": "fa:16:3e:82:d1:87", "network": {"id": "316ecc22-916e-4a30-bb08-c6bd94993bb1", "bridge": "br-int", "label": "tempest-network-smoke--1739360415", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.205", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5212a27-71", "ovs_interfaceid": "b5212a27-711c-427f-af17-227f961acc42", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.761 2 DEBUG nova.network.os_vif_util [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:82:d1:87,bridge_name='br-int',has_traffic_filtering=True,id=b5212a27-711c-427f-af17-227f961acc42,network=Network(316ecc22-916e-4a30-bb08-c6bd94993bb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5212a27-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.761 2 DEBUG os_vif [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:82:d1:87,bridge_name='br-int',has_traffic_filtering=True,id=b5212a27-711c-427f-af17-227f961acc42,network=Network(316ecc22-916e-4a30-bb08-c6bd94993bb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5212a27-71') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.764 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5212a27-71, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:14:00 compute-0 podman[149813]: 2025-10-08 19:14:00.765308364 +0000 UTC m=+0.143233875 container cleanup 194bb928d5f6ef3def4652af7e3fb08edd40a6cbd6283d5f12dfbb88c236f9cf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Oct  8 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 19:14:00 compute-0 systemd[1]: libpod-conmon-194bb928d5f6ef3def4652af7e3fb08edd40a6cbd6283d5f12dfbb88c236f9cf.scope: Deactivated successfully.
Oct  8 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.775 2 INFO os_vif [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:82:d1:87,bridge_name='br-int',has_traffic_filtering=True,id=b5212a27-711c-427f-af17-227f961acc42,network=Network(316ecc22-916e-4a30-bb08-c6bd94993bb1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb5212a27-71')#033[00m
Oct  8 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.776 2 INFO nova.virt.libvirt.driver [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Deleting instance files /var/lib/nova/instances/f6235bae-08b8-41c2-a187-92e12703dc49_del#033[00m
Oct  8 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.777 2 INFO nova.virt.libvirt.driver [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Deletion of /var/lib/nova/instances/f6235bae-08b8-41c2-a187-92e12703dc49_del complete#033[00m
Oct  8 19:14:00 compute-0 podman[149863]: 2025-10-08 19:14:00.828031874 +0000 UTC m=+0.040795948 container remove 194bb928d5f6ef3def4652af7e3fb08edd40a6cbd6283d5f12dfbb88c236f9cf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  8 19:14:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:00.835 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[1ded8c70-223a-42a4-a26c-2bc645880d67]: (4, ('Wed Oct  8 07:14:00 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1 (194bb928d5f6ef3def4652af7e3fb08edd40a6cbd6283d5f12dfbb88c236f9cf)\n194bb928d5f6ef3def4652af7e3fb08edd40a6cbd6283d5f12dfbb88c236f9cf\nWed Oct  8 07:14:00 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1 (194bb928d5f6ef3def4652af7e3fb08edd40a6cbd6283d5f12dfbb88c236f9cf)\n194bb928d5f6ef3def4652af7e3fb08edd40a6cbd6283d5f12dfbb88c236f9cf\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:14:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:00.837 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[f9102b90-651e-43ac-8779-63df50d9f640]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:14:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:00.841 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap316ecc22-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:14:00 compute-0 kernel: tap316ecc22-90: left promiscuous mode
Oct  8 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.845 2 INFO nova.compute.manager [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.846 2 DEBUG oslo.service.loopingcall [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.846 2 DEBUG nova.compute.manager [-] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.847 2 DEBUG nova.network.neutron [-] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:00 compute-0 nova_compute[117514]: 2025-10-08 19:14:00.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:00.870 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[5ca7dfcd-5700-4a1b-b443-b90f994cde7b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:14:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:00.897 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[0af635a4-838d-4667-9a68-d07228dc2f8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:14:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:00.900 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[47ebcc4f-e523-40bb-aec7-b38d78222353]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:14:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:00.923 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[d34d4e8d-e83c-4121-af93-37012dbac095]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 148262, 'reachable_time': 38978, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 149878, 'error': None, 'target': 'ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:14:00 compute-0 systemd[1]: run-netns-ovnmeta\x2d316ecc22\x2d916e\x2d4a30\x2dbb08\x2dc6bd94993bb1.mount: Deactivated successfully.
Oct  8 19:14:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:00.925 28783 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-316ecc22-916e-4a30-bb08-c6bd94993bb1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 19:14:00 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:00.926 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[334101b8-2385-46ba-a202-cd32cacbaa65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:14:01 compute-0 nova_compute[117514]: 2025-10-08 19:14:01.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:02 compute-0 nova_compute[117514]: 2025-10-08 19:14:02.532 2 DEBUG nova.compute.manager [req-934b2f61-9e22-422d-950f-1d2870467692 req-ceaf5385-a2c1-4505-ab7b-1798a0a6ade2 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Received event network-vif-unplugged-b5212a27-711c-427f-af17-227f961acc42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:14:02 compute-0 nova_compute[117514]: 2025-10-08 19:14:02.532 2 DEBUG oslo_concurrency.lockutils [req-934b2f61-9e22-422d-950f-1d2870467692 req-ceaf5385-a2c1-4505-ab7b-1798a0a6ade2 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "f6235bae-08b8-41c2-a187-92e12703dc49-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:14:02 compute-0 nova_compute[117514]: 2025-10-08 19:14:02.533 2 DEBUG oslo_concurrency.lockutils [req-934b2f61-9e22-422d-950f-1d2870467692 req-ceaf5385-a2c1-4505-ab7b-1798a0a6ade2 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "f6235bae-08b8-41c2-a187-92e12703dc49-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:14:02 compute-0 nova_compute[117514]: 2025-10-08 19:14:02.533 2 DEBUG oslo_concurrency.lockutils [req-934b2f61-9e22-422d-950f-1d2870467692 req-ceaf5385-a2c1-4505-ab7b-1798a0a6ade2 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "f6235bae-08b8-41c2-a187-92e12703dc49-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:14:02 compute-0 nova_compute[117514]: 2025-10-08 19:14:02.533 2 DEBUG nova.compute.manager [req-934b2f61-9e22-422d-950f-1d2870467692 req-ceaf5385-a2c1-4505-ab7b-1798a0a6ade2 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] No waiting events found dispatching network-vif-unplugged-b5212a27-711c-427f-af17-227f961acc42 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:14:02 compute-0 nova_compute[117514]: 2025-10-08 19:14:02.534 2 DEBUG nova.compute.manager [req-934b2f61-9e22-422d-950f-1d2870467692 req-ceaf5385-a2c1-4505-ab7b-1798a0a6ade2 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Received event network-vif-unplugged-b5212a27-711c-427f-af17-227f961acc42 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 19:14:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:03.181 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:75:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '5e:14:dd:63:55:2a'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:14:03 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:03.182 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 19:14:03 compute-0 nova_compute[117514]: 2025-10-08 19:14:03.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:03 compute-0 nova_compute[117514]: 2025-10-08 19:14:03.386 2 DEBUG nova.network.neutron [-] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:14:03 compute-0 nova_compute[117514]: 2025-10-08 19:14:03.408 2 INFO nova.compute.manager [-] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Took 2.56 seconds to deallocate network for instance.#033[00m
Oct  8 19:14:03 compute-0 nova_compute[117514]: 2025-10-08 19:14:03.474 2 DEBUG oslo_concurrency.lockutils [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:14:03 compute-0 nova_compute[117514]: 2025-10-08 19:14:03.475 2 DEBUG oslo_concurrency.lockutils [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:14:03 compute-0 nova_compute[117514]: 2025-10-08 19:14:03.541 2 DEBUG nova.compute.provider_tree [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:14:03 compute-0 nova_compute[117514]: 2025-10-08 19:14:03.559 2 DEBUG nova.scheduler.client.report [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:14:03 compute-0 nova_compute[117514]: 2025-10-08 19:14:03.580 2 DEBUG oslo_concurrency.lockutils [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:14:03 compute-0 nova_compute[117514]: 2025-10-08 19:14:03.609 2 INFO nova.scheduler.client.report [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Deleted allocations for instance f6235bae-08b8-41c2-a187-92e12703dc49#033[00m
Oct  8 19:14:03 compute-0 nova_compute[117514]: 2025-10-08 19:14:03.647 2 DEBUG nova.network.neutron [req-cdbb648e-890e-421b-9045-dc67e216ff53 req-d556c957-4c99-4c3d-a99f-5e8773a72029 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Updated VIF entry in instance network info cache for port b5212a27-711c-427f-af17-227f961acc42. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 19:14:03 compute-0 nova_compute[117514]: 2025-10-08 19:14:03.648 2 DEBUG nova.network.neutron [req-cdbb648e-890e-421b-9045-dc67e216ff53 req-d556c957-4c99-4c3d-a99f-5e8773a72029 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Updating instance_info_cache with network_info: [{"id": "b5212a27-711c-427f-af17-227f961acc42", "address": "fa:16:3e:82:d1:87", "network": {"id": "316ecc22-916e-4a30-bb08-c6bd94993bb1", "bridge": "br-int", "label": "tempest-network-smoke--1739360415", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "9.8.7.6", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb5212a27-71", "ovs_interfaceid": "b5212a27-711c-427f-af17-227f961acc42", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:14:03 compute-0 nova_compute[117514]: 2025-10-08 19:14:03.669 2 DEBUG oslo_concurrency.lockutils [None req-fe12152d-c787-45a2-9941-5e205e077fcc efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "f6235bae-08b8-41c2-a187-92e12703dc49" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.235s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:14:03 compute-0 nova_compute[117514]: 2025-10-08 19:14:03.670 2 DEBUG oslo_concurrency.lockutils [req-cdbb648e-890e-421b-9045-dc67e216ff53 req-d556c957-4c99-4c3d-a99f-5e8773a72029 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-f6235bae-08b8-41c2-a187-92e12703dc49" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:14:04 compute-0 nova_compute[117514]: 2025-10-08 19:14:04.611 2 DEBUG nova.compute.manager [req-c2eaa8ad-edd9-4f6a-af4d-edb09245cff8 req-82246bf3-239c-4518-8750-c21451ebc133 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Received event network-vif-plugged-b5212a27-711c-427f-af17-227f961acc42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:14:04 compute-0 nova_compute[117514]: 2025-10-08 19:14:04.612 2 DEBUG oslo_concurrency.lockutils [req-c2eaa8ad-edd9-4f6a-af4d-edb09245cff8 req-82246bf3-239c-4518-8750-c21451ebc133 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "f6235bae-08b8-41c2-a187-92e12703dc49-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:14:04 compute-0 nova_compute[117514]: 2025-10-08 19:14:04.612 2 DEBUG oslo_concurrency.lockutils [req-c2eaa8ad-edd9-4f6a-af4d-edb09245cff8 req-82246bf3-239c-4518-8750-c21451ebc133 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "f6235bae-08b8-41c2-a187-92e12703dc49-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:14:04 compute-0 nova_compute[117514]: 2025-10-08 19:14:04.612 2 DEBUG oslo_concurrency.lockutils [req-c2eaa8ad-edd9-4f6a-af4d-edb09245cff8 req-82246bf3-239c-4518-8750-c21451ebc133 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "f6235bae-08b8-41c2-a187-92e12703dc49-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:14:04 compute-0 nova_compute[117514]: 2025-10-08 19:14:04.612 2 DEBUG nova.compute.manager [req-c2eaa8ad-edd9-4f6a-af4d-edb09245cff8 req-82246bf3-239c-4518-8750-c21451ebc133 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] No waiting events found dispatching network-vif-plugged-b5212a27-711c-427f-af17-227f961acc42 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:14:04 compute-0 nova_compute[117514]: 2025-10-08 19:14:04.613 2 WARNING nova.compute.manager [req-c2eaa8ad-edd9-4f6a-af4d-edb09245cff8 req-82246bf3-239c-4518-8750-c21451ebc133 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Received unexpected event network-vif-plugged-b5212a27-711c-427f-af17-227f961acc42 for instance with vm_state deleted and task_state None.#033[00m
Oct  8 19:14:04 compute-0 nova_compute[117514]: 2025-10-08 19:14:04.613 2 DEBUG nova.compute.manager [req-c2eaa8ad-edd9-4f6a-af4d-edb09245cff8 req-82246bf3-239c-4518-8750-c21451ebc133 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Received event network-vif-deleted-b5212a27-711c-427f-af17-227f961acc42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:14:04 compute-0 nova_compute[117514]: 2025-10-08 19:14:04.613 2 INFO nova.compute.manager [req-c2eaa8ad-edd9-4f6a-af4d-edb09245cff8 req-82246bf3-239c-4518-8750-c21451ebc133 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Neutron deleted interface b5212a27-711c-427f-af17-227f961acc42; detaching it from the instance and deleting it from the info cache#033[00m
Oct  8 19:14:04 compute-0 nova_compute[117514]: 2025-10-08 19:14:04.613 2 DEBUG nova.network.neutron [req-c2eaa8ad-edd9-4f6a-af4d-edb09245cff8 req-82246bf3-239c-4518-8750-c21451ebc133 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106#033[00m
Oct  8 19:14:04 compute-0 nova_compute[117514]: 2025-10-08 19:14:04.616 2 DEBUG nova.compute.manager [req-c2eaa8ad-edd9-4f6a-af4d-edb09245cff8 req-82246bf3-239c-4518-8750-c21451ebc133 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Detach interface failed, port_id=b5212a27-711c-427f-af17-227f961acc42, reason: Instance f6235bae-08b8-41c2-a187-92e12703dc49 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  8 19:14:04 compute-0 podman[149879]: 2025-10-08 19:14:04.695744999 +0000 UTC m=+0.098188764 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.buildah.version=1.33.7, container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.openshift.tags=minimal rhel9, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc.)
Oct  8 19:14:04 compute-0 podman[149880]: 2025-10-08 19:14:04.6985504 +0000 UTC m=+0.095444705 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible)
Oct  8 19:14:04 compute-0 podman[149918]: 2025-10-08 19:14:04.800208702 +0000 UTC m=+0.070613678 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 19:14:05 compute-0 nova_compute[117514]: 2025-10-08 19:14:05.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:14:05 compute-0 nova_compute[117514]: 2025-10-08 19:14:05.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:06 compute-0 nova_compute[117514]: 2025-10-08 19:14:06.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:07 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:07.185 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f81f7a-64d8-418a-a74c-b879bd6deb83, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:14:07 compute-0 nova_compute[117514]: 2025-10-08 19:14:07.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:07 compute-0 nova_compute[117514]: 2025-10-08 19:14:07.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.247 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.247 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.247 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.247 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.247 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.247 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:14:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:14:08.249 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:14:08 compute-0 nova_compute[117514]: 2025-10-08 19:14:08.718 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:14:08 compute-0 nova_compute[117514]: 2025-10-08 19:14:08.741 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:14:08 compute-0 nova_compute[117514]: 2025-10-08 19:14:08.741 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:14:08 compute-0 nova_compute[117514]: 2025-10-08 19:14:08.742 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:14:08 compute-0 nova_compute[117514]: 2025-10-08 19:14:08.742 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 19:14:08 compute-0 podman[149944]: 2025-10-08 19:14:08.862110411 +0000 UTC m=+0.070140104 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, org.label-schema.license=GPLv2)
Oct  8 19:14:08 compute-0 podman[149946]: 2025-10-08 19:14:08.868454514 +0000 UTC m=+0.065197592 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Oct  8 19:14:08 compute-0 podman[149945]: 2025-10-08 19:14:08.95498497 +0000 UTC m=+0.145019224 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 19:14:08 compute-0 nova_compute[117514]: 2025-10-08 19:14:08.959 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 19:14:08 compute-0 nova_compute[117514]: 2025-10-08 19:14:08.961 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6099MB free_disk=73.4137954711914GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 19:14:08 compute-0 nova_compute[117514]: 2025-10-08 19:14:08.961 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:14:08 compute-0 nova_compute[117514]: 2025-10-08 19:14:08.962 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:14:09 compute-0 nova_compute[117514]: 2025-10-08 19:14:09.034 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 19:14:09 compute-0 nova_compute[117514]: 2025-10-08 19:14:09.034 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 19:14:09 compute-0 nova_compute[117514]: 2025-10-08 19:14:09.059 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:14:09 compute-0 nova_compute[117514]: 2025-10-08 19:14:09.079 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:14:09 compute-0 nova_compute[117514]: 2025-10-08 19:14:09.109 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 19:14:09 compute-0 nova_compute[117514]: 2025-10-08 19:14:09.109 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:14:10 compute-0 nova_compute[117514]: 2025-10-08 19:14:10.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:11 compute-0 nova_compute[117514]: 2025-10-08 19:14:11.104 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:14:11 compute-0 nova_compute[117514]: 2025-10-08 19:14:11.104 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:14:11 compute-0 nova_compute[117514]: 2025-10-08 19:14:11.105 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 19:14:11 compute-0 nova_compute[117514]: 2025-10-08 19:14:11.105 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 19:14:11 compute-0 nova_compute[117514]: 2025-10-08 19:14:11.122 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 19:14:11 compute-0 nova_compute[117514]: 2025-10-08 19:14:11.122 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:14:11 compute-0 nova_compute[117514]: 2025-10-08 19:14:11.123 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 19:14:11 compute-0 nova_compute[117514]: 2025-10-08 19:14:11.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:11 compute-0 nova_compute[117514]: 2025-10-08 19:14:11.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:14:12 compute-0 nova_compute[117514]: 2025-10-08 19:14:12.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:14:14 compute-0 nova_compute[117514]: 2025-10-08 19:14:14.718 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:14:15 compute-0 nova_compute[117514]: 2025-10-08 19:14:15.719 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:14:15 compute-0 nova_compute[117514]: 2025-10-08 19:14:15.735 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759950840.7344065, f6235bae-08b8-41c2-a187-92e12703dc49 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:14:15 compute-0 nova_compute[117514]: 2025-10-08 19:14:15.736 2 INFO nova.compute.manager [-] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] VM Stopped (Lifecycle Event)#033[00m
Oct  8 19:14:15 compute-0 nova_compute[117514]: 2025-10-08 19:14:15.757 2 DEBUG nova.compute.manager [None req-529c4b94-2475-4d8f-afb3-35098bdbf50e - - - - - -] [instance: f6235bae-08b8-41c2-a187-92e12703dc49] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:14:15 compute-0 nova_compute[117514]: 2025-10-08 19:14:15.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:16 compute-0 nova_compute[117514]: 2025-10-08 19:14:16.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:19 compute-0 podman[150006]: 2025-10-08 19:14:19.658155254 +0000 UTC m=+0.069623760 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 19:14:20 compute-0 nova_compute[117514]: 2025-10-08 19:14:20.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:21 compute-0 nova_compute[117514]: 2025-10-08 19:14:21.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.495 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.496 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.515 2 DEBUG nova.compute.manager [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.607 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.608 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.621 2 DEBUG nova.virt.hardware [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.621 2 INFO nova.compute.claims [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Claim successful on node compute-0.ctlplane.example.com#033[00m
Oct  8 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.733 2 DEBUG nova.compute.provider_tree [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.747 2 DEBUG nova.scheduler.client.report [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.767 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.768 2 DEBUG nova.compute.manager [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.811 2 DEBUG nova.compute.manager [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.811 2 DEBUG nova.network.neutron [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.832 2 INFO nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.849 2 DEBUG nova.compute.manager [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.943 2 DEBUG nova.compute.manager [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.946 2 DEBUG nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.947 2 INFO nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Creating image(s)#033[00m
Oct  8 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.948 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "/var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.949 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.950 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:14:25 compute-0 nova_compute[117514]: 2025-10-08 19:14:25.975 2 DEBUG oslo_concurrency.processutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.037 2 DEBUG oslo_concurrency.processutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.039 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "008eb3078b811ee47058b7252a820910c35fc6df" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.040 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.066 2 DEBUG oslo_concurrency.processutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.128 2 DEBUG oslo_concurrency.processutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.129 2 DEBUG oslo_concurrency.processutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.172 2 DEBUG oslo_concurrency.processutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.173 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.174 2 DEBUG oslo_concurrency.processutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.214 2 DEBUG nova.policy [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.236 2 DEBUG oslo_concurrency.processutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.237 2 DEBUG nova.virt.disk.api [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Checking if we can resize image /var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  8 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.238 2 DEBUG oslo_concurrency.processutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.300 2 DEBUG oslo_concurrency.processutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.301 2 DEBUG nova.virt.disk.api [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Cannot resize image /var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  8 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.302 2 DEBUG nova.objects.instance [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'migration_context' on Instance uuid 5e004931-f1db-408c-9f7a-6c6c50c5f8ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.317 2 DEBUG nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.317 2 DEBUG nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Ensure instance console log exists: /var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.318 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.319 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.319 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:14:26 compute-0 nova_compute[117514]: 2025-10-08 19:14:26.830 2 DEBUG nova.network.neutron [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Successfully created port: ae9e7968-10b0-4606-9fa3-c91374cf1cc1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 19:14:27 compute-0 nova_compute[117514]: 2025-10-08 19:14:27.699 2 DEBUG nova.network.neutron [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Successfully updated port: ae9e7968-10b0-4606-9fa3-c91374cf1cc1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 19:14:27 compute-0 nova_compute[117514]: 2025-10-08 19:14:27.714 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:14:27 compute-0 nova_compute[117514]: 2025-10-08 19:14:27.715 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquired lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:14:27 compute-0 nova_compute[117514]: 2025-10-08 19:14:27.716 2 DEBUG nova.network.neutron [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 19:14:27 compute-0 nova_compute[117514]: 2025-10-08 19:14:27.786 2 DEBUG nova.compute.manager [req-36472180-8aae-481f-99e9-324f3cea6893 req-31a141d6-f2b5-494e-b7e4-e116f2a4eb4d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received event network-changed-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:14:27 compute-0 nova_compute[117514]: 2025-10-08 19:14:27.787 2 DEBUG nova.compute.manager [req-36472180-8aae-481f-99e9-324f3cea6893 req-31a141d6-f2b5-494e-b7e4-e116f2a4eb4d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Refreshing instance network info cache due to event network-changed-ae9e7968-10b0-4606-9fa3-c91374cf1cc1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 19:14:27 compute-0 nova_compute[117514]: 2025-10-08 19:14:27.787 2 DEBUG oslo_concurrency.lockutils [req-36472180-8aae-481f-99e9-324f3cea6893 req-31a141d6-f2b5-494e-b7e4-e116f2a4eb4d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:14:27 compute-0 nova_compute[117514]: 2025-10-08 19:14:27.861 2 DEBUG nova.network.neutron [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.640 2 DEBUG nova.network.neutron [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Updating instance_info_cache with network_info: [{"id": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "address": "fa:16:3e:23:50:87", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae9e7968-10", "ovs_interfaceid": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.658 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Releasing lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.659 2 DEBUG nova.compute.manager [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Instance network_info: |[{"id": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "address": "fa:16:3e:23:50:87", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae9e7968-10", "ovs_interfaceid": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.660 2 DEBUG oslo_concurrency.lockutils [req-36472180-8aae-481f-99e9-324f3cea6893 req-31a141d6-f2b5-494e-b7e4-e116f2a4eb4d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.660 2 DEBUG nova.network.neutron [req-36472180-8aae-481f-99e9-324f3cea6893 req-31a141d6-f2b5-494e-b7e4-e116f2a4eb4d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Refreshing network info cache for port ae9e7968-10b0-4606-9fa3-c91374cf1cc1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.664 2 DEBUG nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Start _get_guest_xml network_info=[{"id": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "address": "fa:16:3e:23:50:87", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae9e7968-10", "ovs_interfaceid": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'image_id': '23cfa426-7011-4566-992d-1c7af39f70dd'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.671 2 WARNING nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.679 2 DEBUG nova.virt.libvirt.host [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.680 2 DEBUG nova.virt.libvirt.host [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.684 2 DEBUG nova.virt.libvirt.host [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.685 2 DEBUG nova.virt.libvirt.host [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.686 2 DEBUG nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.686 2 DEBUG nova.virt.hardware [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T19:05:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e8a148fc-4419-4813-98ff-a17e2a95609e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.686 2 DEBUG nova.virt.hardware [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.687 2 DEBUG nova.virt.hardware [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.687 2 DEBUG nova.virt.hardware [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.687 2 DEBUG nova.virt.hardware [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.688 2 DEBUG nova.virt.hardware [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.688 2 DEBUG nova.virt.hardware [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.688 2 DEBUG nova.virt.hardware [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.688 2 DEBUG nova.virt.hardware [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.688 2 DEBUG nova.virt.hardware [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.689 2 DEBUG nova.virt.hardware [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.692 2 DEBUG nova.virt.libvirt.vif [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:14:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-103133275',display_name='tempest-TestNetworkBasicOps-server-103133275',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-103133275',id=11,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI5y6d80fHySET4pCbLeqyj0cyDTZn6hTOGziG7pCiD92qFDw7Uq+y0suIKpGvDK2QOm6VBv2vJI5Io6WjjxteICCSlzmOgxu+CdOrYx2YA1B+bI4ndO5c+cp00qcb4ncw==',key_name='tempest-TestNetworkBasicOps-286586540',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-cyi34c6v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:14:25Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=5e004931-f1db-408c-9f7a-6c6c50c5f8ef,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "address": "fa:16:3e:23:50:87", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae9e7968-10", "ovs_interfaceid": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.693 2 DEBUG nova.network.os_vif_util [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "address": "fa:16:3e:23:50:87", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae9e7968-10", "ovs_interfaceid": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.694 2 DEBUG nova.network.os_vif_util [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:23:50:87,bridge_name='br-int',has_traffic_filtering=True,id=ae9e7968-10b0-4606-9fa3-c91374cf1cc1,network=Network(6826b0cb-7eaf-4468-bf17-e3c581bfc4ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae9e7968-10') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.695 2 DEBUG nova.objects.instance [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 5e004931-f1db-408c-9f7a-6c6c50c5f8ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.711 2 DEBUG nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] End _get_guest_xml xml=<domain type="kvm">
Oct  8 19:14:28 compute-0 nova_compute[117514]:  <uuid>5e004931-f1db-408c-9f7a-6c6c50c5f8ef</uuid>
Oct  8 19:14:28 compute-0 nova_compute[117514]:  <name>instance-0000000b</name>
Oct  8 19:14:28 compute-0 nova_compute[117514]:  <memory>131072</memory>
Oct  8 19:14:28 compute-0 nova_compute[117514]:  <vcpu>1</vcpu>
Oct  8 19:14:28 compute-0 nova_compute[117514]:  <metadata>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 19:14:28 compute-0 nova_compute[117514]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:      <nova:name>tempest-TestNetworkBasicOps-server-103133275</nova:name>
Oct  8 19:14:28 compute-0 nova_compute[117514]:      <nova:creationTime>2025-10-08 19:14:28</nova:creationTime>
Oct  8 19:14:28 compute-0 nova_compute[117514]:      <nova:flavor name="m1.nano">
Oct  8 19:14:28 compute-0 nova_compute[117514]:        <nova:memory>128</nova:memory>
Oct  8 19:14:28 compute-0 nova_compute[117514]:        <nova:disk>1</nova:disk>
Oct  8 19:14:28 compute-0 nova_compute[117514]:        <nova:swap>0</nova:swap>
Oct  8 19:14:28 compute-0 nova_compute[117514]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 19:14:28 compute-0 nova_compute[117514]:        <nova:vcpus>1</nova:vcpus>
Oct  8 19:14:28 compute-0 nova_compute[117514]:      </nova:flavor>
Oct  8 19:14:28 compute-0 nova_compute[117514]:      <nova:owner>
Oct  8 19:14:28 compute-0 nova_compute[117514]:        <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct  8 19:14:28 compute-0 nova_compute[117514]:        <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct  8 19:14:28 compute-0 nova_compute[117514]:      </nova:owner>
Oct  8 19:14:28 compute-0 nova_compute[117514]:      <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:      <nova:ports>
Oct  8 19:14:28 compute-0 nova_compute[117514]:        <nova:port uuid="ae9e7968-10b0-4606-9fa3-c91374cf1cc1">
Oct  8 19:14:28 compute-0 nova_compute[117514]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:        </nova:port>
Oct  8 19:14:28 compute-0 nova_compute[117514]:      </nova:ports>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    </nova:instance>
Oct  8 19:14:28 compute-0 nova_compute[117514]:  </metadata>
Oct  8 19:14:28 compute-0 nova_compute[117514]:  <sysinfo type="smbios">
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <system>
Oct  8 19:14:28 compute-0 nova_compute[117514]:      <entry name="manufacturer">RDO</entry>
Oct  8 19:14:28 compute-0 nova_compute[117514]:      <entry name="product">OpenStack Compute</entry>
Oct  8 19:14:28 compute-0 nova_compute[117514]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 19:14:28 compute-0 nova_compute[117514]:      <entry name="serial">5e004931-f1db-408c-9f7a-6c6c50c5f8ef</entry>
Oct  8 19:14:28 compute-0 nova_compute[117514]:      <entry name="uuid">5e004931-f1db-408c-9f7a-6c6c50c5f8ef</entry>
Oct  8 19:14:28 compute-0 nova_compute[117514]:      <entry name="family">Virtual Machine</entry>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    </system>
Oct  8 19:14:28 compute-0 nova_compute[117514]:  </sysinfo>
Oct  8 19:14:28 compute-0 nova_compute[117514]:  <os>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <boot dev="hd"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <smbios mode="sysinfo"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:  </os>
Oct  8 19:14:28 compute-0 nova_compute[117514]:  <features>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <acpi/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <apic/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <vmcoreinfo/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:  </features>
Oct  8 19:14:28 compute-0 nova_compute[117514]:  <clock offset="utc">
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <timer name="hpet" present="no"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:  </clock>
Oct  8 19:14:28 compute-0 nova_compute[117514]:  <cpu mode="host-model" match="exact">
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:  </cpu>
Oct  8 19:14:28 compute-0 nova_compute[117514]:  <devices>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <disk type="file" device="disk">
Oct  8 19:14:28 compute-0 nova_compute[117514]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:      <source file="/var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/disk"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:      <target dev="vda" bus="virtio"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <disk type="file" device="cdrom">
Oct  8 19:14:28 compute-0 nova_compute[117514]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:      <source file="/var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/disk.config"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:      <target dev="sda" bus="sata"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <interface type="ethernet">
Oct  8 19:14:28 compute-0 nova_compute[117514]:      <mac address="fa:16:3e:23:50:87"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:      <model type="virtio"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:      <mtu size="1442"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:      <target dev="tapae9e7968-10"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    </interface>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <serial type="pty">
Oct  8 19:14:28 compute-0 nova_compute[117514]:      <log file="/var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/console.log" append="off"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    </serial>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <video>
Oct  8 19:14:28 compute-0 nova_compute[117514]:      <model type="virtio"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    </video>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <input type="tablet" bus="usb"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <rng model="virtio">
Oct  8 19:14:28 compute-0 nova_compute[117514]:      <backend model="random">/dev/urandom</backend>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    </rng>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <controller type="usb" index="0"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    <memballoon model="virtio">
Oct  8 19:14:28 compute-0 nova_compute[117514]:      <stats period="10"/>
Oct  8 19:14:28 compute-0 nova_compute[117514]:    </memballoon>
Oct  8 19:14:28 compute-0 nova_compute[117514]:  </devices>
Oct  8 19:14:28 compute-0 nova_compute[117514]: </domain>
Oct  8 19:14:28 compute-0 nova_compute[117514]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.712 2 DEBUG nova.compute.manager [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Preparing to wait for external event network-vif-plugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.713 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.714 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.714 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.716 2 DEBUG nova.virt.libvirt.vif [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:14:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-103133275',display_name='tempest-TestNetworkBasicOps-server-103133275',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-103133275',id=11,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI5y6d80fHySET4pCbLeqyj0cyDTZn6hTOGziG7pCiD92qFDw7Uq+y0suIKpGvDK2QOm6VBv2vJI5Io6WjjxteICCSlzmOgxu+CdOrYx2YA1B+bI4ndO5c+cp00qcb4ncw==',key_name='tempest-TestNetworkBasicOps-286586540',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-cyi34c6v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:14:25Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=5e004931-f1db-408c-9f7a-6c6c50c5f8ef,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "address": "fa:16:3e:23:50:87", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae9e7968-10", "ovs_interfaceid": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.717 2 DEBUG nova.network.os_vif_util [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "address": "fa:16:3e:23:50:87", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae9e7968-10", "ovs_interfaceid": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.718 2 DEBUG nova.network.os_vif_util [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:23:50:87,bridge_name='br-int',has_traffic_filtering=True,id=ae9e7968-10b0-4606-9fa3-c91374cf1cc1,network=Network(6826b0cb-7eaf-4468-bf17-e3c581bfc4ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae9e7968-10') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.719 2 DEBUG os_vif [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:23:50:87,bridge_name='br-int',has_traffic_filtering=True,id=ae9e7968-10b0-4606-9fa3-c91374cf1cc1,network=Network(6826b0cb-7eaf-4468-bf17-e3c581bfc4ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae9e7968-10') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.721 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.722 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.728 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapae9e7968-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.729 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapae9e7968-10, col_values=(('external_ids', {'iface-id': 'ae9e7968-10b0-4606-9fa3-c91374cf1cc1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:23:50:87', 'vm-uuid': '5e004931-f1db-408c-9f7a-6c6c50c5f8ef'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:28 compute-0 NetworkManager[1035]: <info>  [1759950868.7333] manager: (tapae9e7968-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.740 2 INFO os_vif [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:23:50:87,bridge_name='br-int',has_traffic_filtering=True,id=ae9e7968-10b0-4606-9fa3-c91374cf1cc1,network=Network(6826b0cb-7eaf-4468-bf17-e3c581bfc4ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae9e7968-10')#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.784 2 DEBUG nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.784 2 DEBUG nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.785 2 DEBUG nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No VIF found with MAC fa:16:3e:23:50:87, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 19:14:28 compute-0 nova_compute[117514]: 2025-10-08 19:14:28.785 2 INFO nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Using config drive#033[00m
Oct  8 19:14:29 compute-0 nova_compute[117514]: 2025-10-08 19:14:29.197 2 INFO nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Creating config drive at /var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/disk.config#033[00m
Oct  8 19:14:29 compute-0 nova_compute[117514]: 2025-10-08 19:14:29.206 2 DEBUG oslo_concurrency.processutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqpekn68q execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:14:29 compute-0 nova_compute[117514]: 2025-10-08 19:14:29.347 2 DEBUG oslo_concurrency.processutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqpekn68q" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:14:29 compute-0 kernel: tapae9e7968-10: entered promiscuous mode
Oct  8 19:14:29 compute-0 NetworkManager[1035]: <info>  [1759950869.4416] manager: (tapae9e7968-10): new Tun device (/org/freedesktop/NetworkManager/Devices/82)
Oct  8 19:14:29 compute-0 ovn_controller[19759]: 2025-10-08T19:14:29Z|00137|binding|INFO|Claiming lport ae9e7968-10b0-4606-9fa3-c91374cf1cc1 for this chassis.
Oct  8 19:14:29 compute-0 ovn_controller[19759]: 2025-10-08T19:14:29Z|00138|binding|INFO|ae9e7968-10b0-4606-9fa3-c91374cf1cc1: Claiming fa:16:3e:23:50:87 10.100.0.4
Oct  8 19:14:29 compute-0 nova_compute[117514]: 2025-10-08 19:14:29.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.496 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:23:50:87 10.100.0.4'], port_security=['fa:16:3e:23:50:87 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '5e004931-f1db-408c-9f7a-6c6c50c5f8ef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c3b607ea-9253-4328-bb00-668338c7a25d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=770536b4-68ae-4751-9b56-96d89b6bc561, chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=ae9e7968-10b0-4606-9fa3-c91374cf1cc1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.497 28643 INFO neutron.agent.ovn.metadata.agent [-] Port ae9e7968-10b0-4606-9fa3-c91374cf1cc1 in datapath 6826b0cb-7eaf-4468-bf17-e3c581bfc4ac bound to our chassis#033[00m
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.498 28643 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6826b0cb-7eaf-4468-bf17-e3c581bfc4ac#033[00m
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.512 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[dab5c0f1-1002-4339-939b-ddca8bb83dca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.513 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6826b0cb-71 in ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.515 144726 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6826b0cb-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.515 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[14f75afb-2221-4e58-a3db-8ab059fe2470]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.516 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[da5f1852-8d17-401e-b67c-00ac3a9911f7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:14:29 compute-0 systemd-udevd[150082]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 19:14:29 compute-0 systemd-machined[77568]: New machine qemu-11-instance-0000000b.
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.542 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[09b5fc4d-b421-4ccc-9c0b-a9e3840240db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:14:29 compute-0 NetworkManager[1035]: <info>  [1759950869.5450] device (tapae9e7968-10): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 19:14:29 compute-0 NetworkManager[1035]: <info>  [1759950869.5460] device (tapae9e7968-10): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 19:14:29 compute-0 nova_compute[117514]: 2025-10-08 19:14:29.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:29 compute-0 systemd[1]: Started Virtual Machine qemu-11-instance-0000000b.
Oct  8 19:14:29 compute-0 ovn_controller[19759]: 2025-10-08T19:14:29Z|00139|binding|INFO|Setting lport ae9e7968-10b0-4606-9fa3-c91374cf1cc1 ovn-installed in OVS
Oct  8 19:14:29 compute-0 ovn_controller[19759]: 2025-10-08T19:14:29Z|00140|binding|INFO|Setting lport ae9e7968-10b0-4606-9fa3-c91374cf1cc1 up in Southbound
Oct  8 19:14:29 compute-0 nova_compute[117514]: 2025-10-08 19:14:29.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:29 compute-0 nova_compute[117514]: 2025-10-08 19:14:29.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.574 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[7ee601ff-344e-4454-9513-b931fac9ae4e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:14:29 compute-0 podman[150058]: 2025-10-08 19:14:29.580693445 +0000 UTC m=+0.131259628 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251001, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.605 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[e6b340e8-a958-4497-b61d-af2518421ff2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.609 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[54f2a5a3-3411-408d-a72b-bf5ee7f3a517]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:14:29 compute-0 systemd-udevd[150086]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 19:14:29 compute-0 NetworkManager[1035]: <info>  [1759950869.6108] manager: (tap6826b0cb-70): new Veth device (/org/freedesktop/NetworkManager/Devices/83)
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.638 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[d28d90df-e434-4487-9d07-4cc62bb62fe1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.645 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[e55ccb9c-8f81-4613-a552-f83ce4ce85fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:14:29 compute-0 NetworkManager[1035]: <info>  [1759950869.6663] device (tap6826b0cb-70): carrier: link connected
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.671 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[149c8ca2-4f37-4dfd-a781-1c449097c71b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.686 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[adec9253-0955-4665-a55d-1a31fde7862a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6826b0cb-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:04:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 154024, 'reachable_time': 41558, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 150117, 'error': None, 'target': 'ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.700 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[5485fb2a-5f24-4372-98c0-96b3f85d91bd]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe25:42d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 154024, 'tstamp': 154024}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 150118, 'error': None, 'target': 'ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.721 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[9ca37da0-3434-470b-b8cc-633f448d0cb5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6826b0cb-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:04:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 154024, 'reachable_time': 41558, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 150119, 'error': None, 'target': 'ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.769 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[aabd2cbe-15b0-4e12-ad84-36057171c925]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:14:29 compute-0 nova_compute[117514]: 2025-10-08 19:14:29.859 2 DEBUG nova.network.neutron [req-36472180-8aae-481f-99e9-324f3cea6893 req-31a141d6-f2b5-494e-b7e4-e116f2a4eb4d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Updated VIF entry in instance network info cache for port ae9e7968-10b0-4606-9fa3-c91374cf1cc1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 19:14:29 compute-0 nova_compute[117514]: 2025-10-08 19:14:29.860 2 DEBUG nova.network.neutron [req-36472180-8aae-481f-99e9-324f3cea6893 req-31a141d6-f2b5-494e-b7e4-e116f2a4eb4d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Updating instance_info_cache with network_info: [{"id": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "address": "fa:16:3e:23:50:87", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae9e7968-10", "ovs_interfaceid": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.861 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[327bf00e-8458-4923-be66-682d3a7dd979]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.864 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6826b0cb-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.864 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.865 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6826b0cb-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:14:29 compute-0 kernel: tap6826b0cb-70: entered promiscuous mode
Oct  8 19:14:29 compute-0 NetworkManager[1035]: <info>  [1759950869.8679] manager: (tap6826b0cb-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Oct  8 19:14:29 compute-0 nova_compute[117514]: 2025-10-08 19:14:29.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.870 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6826b0cb-70, col_values=(('external_ids', {'iface-id': 'eabc4672-d176-4f11-b5f6-bcbea840c3e8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.872 28643 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6826b0cb-7eaf-4468-bf17-e3c581bfc4ac.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6826b0cb-7eaf-4468-bf17-e3c581bfc4ac.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 19:14:29 compute-0 ovn_controller[19759]: 2025-10-08T19:14:29Z|00141|binding|INFO|Releasing lport eabc4672-d176-4f11-b5f6-bcbea840c3e8 from this chassis (sb_readonly=0)
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.875 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[94fe8f8e-8ba8-44b3-a65c-9969f3f13ab2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.876 28643 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]: global
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]:    log         /dev/log local0 debug
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]:    log-tag     haproxy-metadata-proxy-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]:    user        root
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]:    group       root
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]:    maxconn     1024
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]:    pidfile     /var/lib/neutron/external/pids/6826b0cb-7eaf-4468-bf17-e3c581bfc4ac.pid.haproxy
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]:    daemon
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]: 
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]: defaults
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]:    log global
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]:    mode http
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]:    option httplog
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]:    option dontlognull
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]:    option http-server-close
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]:    option forwardfor
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]:    retries                 3
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]:    timeout http-request    30s
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]:    timeout connect         30s
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]:    timeout client          32s
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]:    timeout server          32s
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]:    timeout http-keep-alive 30s
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]: 
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]: 
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]: listen listener
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]:    bind 169.254.169.254:80
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]:    http-request add-header X-OVN-Network-ID 6826b0cb-7eaf-4468-bf17-e3c581bfc4ac
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 19:14:29 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:29.878 28643 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac', 'env', 'PROCESS_TAG=haproxy-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6826b0cb-7eaf-4468-bf17-e3c581bfc4ac.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 19:14:29 compute-0 nova_compute[117514]: 2025-10-08 19:14:29.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:29 compute-0 nova_compute[117514]: 2025-10-08 19:14:29.893 2 DEBUG nova.compute.manager [req-6a1a61a3-502c-4ac5-bcc4-e648c7210557 req-ee88695f-33ad-481f-bc70-bbbc14834775 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received event network-vif-plugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:14:29 compute-0 nova_compute[117514]: 2025-10-08 19:14:29.893 2 DEBUG oslo_concurrency.lockutils [req-6a1a61a3-502c-4ac5-bcc4-e648c7210557 req-ee88695f-33ad-481f-bc70-bbbc14834775 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:14:29 compute-0 nova_compute[117514]: 2025-10-08 19:14:29.895 2 DEBUG oslo_concurrency.lockutils [req-6a1a61a3-502c-4ac5-bcc4-e648c7210557 req-ee88695f-33ad-481f-bc70-bbbc14834775 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:14:29 compute-0 nova_compute[117514]: 2025-10-08 19:14:29.895 2 DEBUG oslo_concurrency.lockutils [req-6a1a61a3-502c-4ac5-bcc4-e648c7210557 req-ee88695f-33ad-481f-bc70-bbbc14834775 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:14:29 compute-0 nova_compute[117514]: 2025-10-08 19:14:29.896 2 DEBUG nova.compute.manager [req-6a1a61a3-502c-4ac5-bcc4-e648c7210557 req-ee88695f-33ad-481f-bc70-bbbc14834775 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Processing event network-vif-plugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 19:14:29 compute-0 nova_compute[117514]: 2025-10-08 19:14:29.898 2 DEBUG oslo_concurrency.lockutils [req-36472180-8aae-481f-99e9-324f3cea6893 req-31a141d6-f2b5-494e-b7e4-e116f2a4eb4d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:14:30 compute-0 podman[150158]: 2025-10-08 19:14:30.328141819 +0000 UTC m=+0.078371382 container create fd1984bbb04b2d37335257684b95c287df155b6e2e004efa2c044877a08a6b2b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 19:14:30 compute-0 systemd[1]: Started libpod-conmon-fd1984bbb04b2d37335257684b95c287df155b6e2e004efa2c044877a08a6b2b.scope.
Oct  8 19:14:30 compute-0 podman[150158]: 2025-10-08 19:14:30.28763491 +0000 UTC m=+0.037864564 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct  8 19:14:30 compute-0 systemd[1]: Started libcrun container.
Oct  8 19:14:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d6837e9d7ec807fa0b782d57371b150b504d3cac3a36af379f52c92234713c15/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 19:14:30 compute-0 podman[150158]: 2025-10-08 19:14:30.421152003 +0000 UTC m=+0.171381596 container init fd1984bbb04b2d37335257684b95c287df155b6e2e004efa2c044877a08a6b2b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 19:14:30 compute-0 podman[150158]: 2025-10-08 19:14:30.431085479 +0000 UTC m=+0.181315042 container start fd1984bbb04b2d37335257684b95c287df155b6e2e004efa2c044877a08a6b2b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  8 19:14:30 compute-0 neutron-haproxy-ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac[150173]: [NOTICE]   (150177) : New worker (150179) forked
Oct  8 19:14:30 compute-0 neutron-haproxy-ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac[150173]: [NOTICE]   (150177) : Loading success.
Oct  8 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.498 2 DEBUG nova.compute.manager [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.499 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950870.4977334, 5e004931-f1db-408c-9f7a-6c6c50c5f8ef => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.500 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] VM Started (Lifecycle Event)#033[00m
Oct  8 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.504 2 DEBUG nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.508 2 INFO nova.virt.libvirt.driver [-] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Instance spawned successfully.#033[00m
Oct  8 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.509 2 DEBUG nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.536 2 DEBUG nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.536 2 DEBUG nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.537 2 DEBUG nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.537 2 DEBUG nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.537 2 DEBUG nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.538 2 DEBUG nova.virt.libvirt.driver [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.542 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.545 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.578 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.578 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950870.49909, 5e004931-f1db-408c-9f7a-6c6c50c5f8ef => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.578 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] VM Paused (Lifecycle Event)#033[00m
Oct  8 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.608 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.611 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950870.502619, 5e004931-f1db-408c-9f7a-6c6c50c5f8ef => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.611 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] VM Resumed (Lifecycle Event)#033[00m
Oct  8 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.615 2 INFO nova.compute.manager [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Took 4.67 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.616 2 DEBUG nova.compute.manager [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.638 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.641 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.665 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.679 2 INFO nova.compute.manager [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Took 5.10 seconds to build instance.#033[00m
Oct  8 19:14:30 compute-0 nova_compute[117514]: 2025-10-08 19:14:30.693 2 DEBUG oslo_concurrency.lockutils [None req-3a474fbf-257d-4122-9fc5-ffd4bffd6857 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:14:31 compute-0 nova_compute[117514]: 2025-10-08 19:14:31.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:31 compute-0 nova_compute[117514]: 2025-10-08 19:14:31.990 2 DEBUG nova.compute.manager [req-45fa6cb2-377a-403e-98d9-b23aae8f33e8 req-345f1e34-1ccc-4ec6-b238-955c274fdfd6 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received event network-vif-plugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:14:31 compute-0 nova_compute[117514]: 2025-10-08 19:14:31.991 2 DEBUG oslo_concurrency.lockutils [req-45fa6cb2-377a-403e-98d9-b23aae8f33e8 req-345f1e34-1ccc-4ec6-b238-955c274fdfd6 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:14:31 compute-0 nova_compute[117514]: 2025-10-08 19:14:31.991 2 DEBUG oslo_concurrency.lockutils [req-45fa6cb2-377a-403e-98d9-b23aae8f33e8 req-345f1e34-1ccc-4ec6-b238-955c274fdfd6 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:14:31 compute-0 nova_compute[117514]: 2025-10-08 19:14:31.991 2 DEBUG oslo_concurrency.lockutils [req-45fa6cb2-377a-403e-98d9-b23aae8f33e8 req-345f1e34-1ccc-4ec6-b238-955c274fdfd6 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:14:31 compute-0 nova_compute[117514]: 2025-10-08 19:14:31.992 2 DEBUG nova.compute.manager [req-45fa6cb2-377a-403e-98d9-b23aae8f33e8 req-345f1e34-1ccc-4ec6-b238-955c274fdfd6 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] No waiting events found dispatching network-vif-plugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:14:31 compute-0 nova_compute[117514]: 2025-10-08 19:14:31.992 2 WARNING nova.compute.manager [req-45fa6cb2-377a-403e-98d9-b23aae8f33e8 req-345f1e34-1ccc-4ec6-b238-955c274fdfd6 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received unexpected event network-vif-plugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 for instance with vm_state active and task_state None.#033[00m
Oct  8 19:14:33 compute-0 nova_compute[117514]: 2025-10-08 19:14:33.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:35 compute-0 podman[150188]: 2025-10-08 19:14:35.684811102 +0000 UTC m=+0.093136458 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_id=edpm, name=ubi9-minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, container_name=openstack_network_exporter)
Oct  8 19:14:35 compute-0 podman[150190]: 2025-10-08 19:14:35.701754041 +0000 UTC m=+0.104986850 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 19:14:35 compute-0 podman[150189]: 2025-10-08 19:14:35.708146925 +0000 UTC m=+0.113322990 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 19:14:35 compute-0 nova_compute[117514]: 2025-10-08 19:14:35.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:35 compute-0 ovn_controller[19759]: 2025-10-08T19:14:35Z|00142|binding|INFO|Releasing lport eabc4672-d176-4f11-b5f6-bcbea840c3e8 from this chassis (sb_readonly=0)
Oct  8 19:14:35 compute-0 NetworkManager[1035]: <info>  [1759950875.7369] manager: (patch-br-int-to-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Oct  8 19:14:35 compute-0 NetworkManager[1035]: <info>  [1759950875.7379] manager: (patch-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Oct  8 19:14:35 compute-0 ovn_controller[19759]: 2025-10-08T19:14:35Z|00143|binding|INFO|Releasing lport eabc4672-d176-4f11-b5f6-bcbea840c3e8 from this chassis (sb_readonly=0)
Oct  8 19:14:35 compute-0 nova_compute[117514]: 2025-10-08 19:14:35.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:36 compute-0 nova_compute[117514]: 2025-10-08 19:14:35.999 2 DEBUG nova.compute.manager [req-e7b03236-9547-43aa-b203-494b406bfe28 req-fa264db8-af84-44e3-97b4-221c36542ef4 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received event network-changed-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:14:36 compute-0 nova_compute[117514]: 2025-10-08 19:14:36.001 2 DEBUG nova.compute.manager [req-e7b03236-9547-43aa-b203-494b406bfe28 req-fa264db8-af84-44e3-97b4-221c36542ef4 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Refreshing instance network info cache due to event network-changed-ae9e7968-10b0-4606-9fa3-c91374cf1cc1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 19:14:36 compute-0 nova_compute[117514]: 2025-10-08 19:14:36.002 2 DEBUG oslo_concurrency.lockutils [req-e7b03236-9547-43aa-b203-494b406bfe28 req-fa264db8-af84-44e3-97b4-221c36542ef4 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:14:36 compute-0 nova_compute[117514]: 2025-10-08 19:14:36.003 2 DEBUG oslo_concurrency.lockutils [req-e7b03236-9547-43aa-b203-494b406bfe28 req-fa264db8-af84-44e3-97b4-221c36542ef4 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:14:36 compute-0 nova_compute[117514]: 2025-10-08 19:14:36.004 2 DEBUG nova.network.neutron [req-e7b03236-9547-43aa-b203-494b406bfe28 req-fa264db8-af84-44e3-97b4-221c36542ef4 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Refreshing network info cache for port ae9e7968-10b0-4606-9fa3-c91374cf1cc1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 19:14:36 compute-0 nova_compute[117514]: 2025-10-08 19:14:36.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:37 compute-0 nova_compute[117514]: 2025-10-08 19:14:37.231 2 DEBUG nova.network.neutron [req-e7b03236-9547-43aa-b203-494b406bfe28 req-fa264db8-af84-44e3-97b4-221c36542ef4 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Updated VIF entry in instance network info cache for port ae9e7968-10b0-4606-9fa3-c91374cf1cc1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 19:14:37 compute-0 nova_compute[117514]: 2025-10-08 19:14:37.233 2 DEBUG nova.network.neutron [req-e7b03236-9547-43aa-b203-494b406bfe28 req-fa264db8-af84-44e3-97b4-221c36542ef4 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Updating instance_info_cache with network_info: [{"id": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "address": "fa:16:3e:23:50:87", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae9e7968-10", "ovs_interfaceid": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:14:37 compute-0 nova_compute[117514]: 2025-10-08 19:14:37.251 2 DEBUG oslo_concurrency.lockutils [req-e7b03236-9547-43aa-b203-494b406bfe28 req-fa264db8-af84-44e3-97b4-221c36542ef4 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:14:38 compute-0 nova_compute[117514]: 2025-10-08 19:14:38.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:39 compute-0 podman[150256]: 2025-10-08 19:14:39.644040398 +0000 UTC m=+0.055340028 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Oct  8 19:14:39 compute-0 podman[150254]: 2025-10-08 19:14:39.655226771 +0000 UTC m=+0.068412425 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Oct  8 19:14:39 compute-0 podman[150255]: 2025-10-08 19:14:39.696561183 +0000 UTC m=+0.109719696 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  8 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.487 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.488 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.506 2 DEBUG nova.compute.manager [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.606 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.608 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.618 2 DEBUG nova.virt.hardware [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.618 2 INFO nova.compute.claims [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Claim successful on node compute-0.ctlplane.example.com#033[00m
Oct  8 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.732 2 DEBUG nova.compute.provider_tree [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.747 2 DEBUG nova.scheduler.client.report [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.768 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.769 2 DEBUG nova.compute.manager [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.817 2 DEBUG nova.compute.manager [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.818 2 DEBUG nova.network.neutron [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.838 2 INFO nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.864 2 DEBUG nova.compute.manager [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.956 2 DEBUG nova.compute.manager [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.958 2 DEBUG nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.958 2 INFO nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Creating image(s)#033[00m
Oct  8 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.959 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "/var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.960 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.961 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:14:41 compute-0 nova_compute[117514]: 2025-10-08 19:14:41.984 2 DEBUG oslo_concurrency.processutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.077 2 DEBUG oslo_concurrency.processutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.078 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "008eb3078b811ee47058b7252a820910c35fc6df" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.079 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.112 2 DEBUG oslo_concurrency.processutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.209 2 DEBUG oslo_concurrency.processutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.097s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.210 2 DEBUG oslo_concurrency.processutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.243 2 DEBUG nova.policy [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.263 2 DEBUG oslo_concurrency.processutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/disk 1073741824" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.264 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.265 2 DEBUG oslo_concurrency.processutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.335 2 DEBUG oslo_concurrency.processutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.336 2 DEBUG nova.virt.disk.api [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Checking if we can resize image /var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  8 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.337 2 DEBUG oslo_concurrency.processutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.403 2 DEBUG oslo_concurrency.processutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.404 2 DEBUG nova.virt.disk.api [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Cannot resize image /var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  8 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.405 2 DEBUG nova.objects.instance [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'migration_context' on Instance uuid 2cd8a1e0-1eff-4f72-b839-340a50f3f21c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.421 2 DEBUG nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.422 2 DEBUG nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Ensure instance console log exists: /var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.422 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.423 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.424 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:14:42 compute-0 ovn_controller[19759]: 2025-10-08T19:14:42Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:23:50:87 10.100.0.4
Oct  8 19:14:42 compute-0 ovn_controller[19759]: 2025-10-08T19:14:42Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:23:50:87 10.100.0.4
Oct  8 19:14:42 compute-0 nova_compute[117514]: 2025-10-08 19:14:42.810 2 DEBUG nova.network.neutron [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Successfully created port: 2139e839-c698-494f-9fbc-5605baef1d1d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 19:14:43 compute-0 nova_compute[117514]: 2025-10-08 19:14:43.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:43 compute-0 nova_compute[117514]: 2025-10-08 19:14:43.879 2 DEBUG nova.network.neutron [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Successfully updated port: 2139e839-c698-494f-9fbc-5605baef1d1d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 19:14:43 compute-0 nova_compute[117514]: 2025-10-08 19:14:43.895 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "refresh_cache-2cd8a1e0-1eff-4f72-b839-340a50f3f21c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:14:43 compute-0 nova_compute[117514]: 2025-10-08 19:14:43.896 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquired lock "refresh_cache-2cd8a1e0-1eff-4f72-b839-340a50f3f21c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:14:43 compute-0 nova_compute[117514]: 2025-10-08 19:14:43.896 2 DEBUG nova.network.neutron [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 19:14:43 compute-0 nova_compute[117514]: 2025-10-08 19:14:43.976 2 DEBUG nova.compute.manager [req-a5cff1f8-d980-48ea-80eb-6186ce0832e7 req-7f55d453-1e72-4d52-8451-b897f4dff90a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Received event network-changed-2139e839-c698-494f-9fbc-5605baef1d1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:14:43 compute-0 nova_compute[117514]: 2025-10-08 19:14:43.976 2 DEBUG nova.compute.manager [req-a5cff1f8-d980-48ea-80eb-6186ce0832e7 req-7f55d453-1e72-4d52-8451-b897f4dff90a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Refreshing instance network info cache due to event network-changed-2139e839-c698-494f-9fbc-5605baef1d1d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 19:14:43 compute-0 nova_compute[117514]: 2025-10-08 19:14:43.977 2 DEBUG oslo_concurrency.lockutils [req-a5cff1f8-d980-48ea-80eb-6186ce0832e7 req-7f55d453-1e72-4d52-8451-b897f4dff90a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-2cd8a1e0-1eff-4f72-b839-340a50f3f21c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.191 2 DEBUG nova.network.neutron [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 19:14:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:44.234 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:14:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:44.235 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:14:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:44.236 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.856 2 DEBUG nova.network.neutron [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Updating instance_info_cache with network_info: [{"id": "2139e839-c698-494f-9fbc-5605baef1d1d", "address": "fa:16:3e:22:30:5a", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2139e839-c6", "ovs_interfaceid": "2139e839-c698-494f-9fbc-5605baef1d1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.881 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Releasing lock "refresh_cache-2cd8a1e0-1eff-4f72-b839-340a50f3f21c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.881 2 DEBUG nova.compute.manager [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Instance network_info: |[{"id": "2139e839-c698-494f-9fbc-5605baef1d1d", "address": "fa:16:3e:22:30:5a", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2139e839-c6", "ovs_interfaceid": "2139e839-c698-494f-9fbc-5605baef1d1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.882 2 DEBUG oslo_concurrency.lockutils [req-a5cff1f8-d980-48ea-80eb-6186ce0832e7 req-7f55d453-1e72-4d52-8451-b897f4dff90a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-2cd8a1e0-1eff-4f72-b839-340a50f3f21c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.882 2 DEBUG nova.network.neutron [req-a5cff1f8-d980-48ea-80eb-6186ce0832e7 req-7f55d453-1e72-4d52-8451-b897f4dff90a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Refreshing network info cache for port 2139e839-c698-494f-9fbc-5605baef1d1d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.885 2 DEBUG nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Start _get_guest_xml network_info=[{"id": "2139e839-c698-494f-9fbc-5605baef1d1d", "address": "fa:16:3e:22:30:5a", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2139e839-c6", "ovs_interfaceid": "2139e839-c698-494f-9fbc-5605baef1d1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'image_id': '23cfa426-7011-4566-992d-1c7af39f70dd'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.890 2 WARNING nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.895 2 DEBUG nova.virt.libvirt.host [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.895 2 DEBUG nova.virt.libvirt.host [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.899 2 DEBUG nova.virt.libvirt.host [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.899 2 DEBUG nova.virt.libvirt.host [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.900 2 DEBUG nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.900 2 DEBUG nova.virt.hardware [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T19:05:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e8a148fc-4419-4813-98ff-a17e2a95609e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.901 2 DEBUG nova.virt.hardware [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.901 2 DEBUG nova.virt.hardware [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.901 2 DEBUG nova.virt.hardware [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.901 2 DEBUG nova.virt.hardware [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.901 2 DEBUG nova.virt.hardware [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.902 2 DEBUG nova.virt.hardware [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.902 2 DEBUG nova.virt.hardware [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.902 2 DEBUG nova.virt.hardware [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.902 2 DEBUG nova.virt.hardware [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.903 2 DEBUG nova.virt.hardware [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.906 2 DEBUG nova.virt.libvirt.vif [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:14:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2104709800',display_name='tempest-TestNetworkBasicOps-server-2104709800',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2104709800',id=12,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD7KRl2SW48tLsDGtdUZXstQI0RJAgkIMeGypW4KhorPNM5dX0aheM9ROODmr544NnSbnVhZPkTpmB3kqR7fi9vzFVS1BaUwNIB2s1Cu3kNzwW4pHA+avxmDokcR+QqgSQ==',key_name='tempest-TestNetworkBasicOps-1494317570',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-vtc0uukp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:14:41Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=2cd8a1e0-1eff-4f72-b839-340a50f3f21c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2139e839-c698-494f-9fbc-5605baef1d1d", "address": "fa:16:3e:22:30:5a", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2139e839-c6", "ovs_interfaceid": "2139e839-c698-494f-9fbc-5605baef1d1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.906 2 DEBUG nova.network.os_vif_util [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "2139e839-c698-494f-9fbc-5605baef1d1d", "address": "fa:16:3e:22:30:5a", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2139e839-c6", "ovs_interfaceid": "2139e839-c698-494f-9fbc-5605baef1d1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.907 2 DEBUG nova.network.os_vif_util [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:30:5a,bridge_name='br-int',has_traffic_filtering=True,id=2139e839-c698-494f-9fbc-5605baef1d1d,network=Network(6826b0cb-7eaf-4468-bf17-e3c581bfc4ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2139e839-c6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.908 2 DEBUG nova.objects.instance [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2cd8a1e0-1eff-4f72-b839-340a50f3f21c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.923 2 DEBUG nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] End _get_guest_xml xml=<domain type="kvm">
Oct  8 19:14:44 compute-0 nova_compute[117514]:  <uuid>2cd8a1e0-1eff-4f72-b839-340a50f3f21c</uuid>
Oct  8 19:14:44 compute-0 nova_compute[117514]:  <name>instance-0000000c</name>
Oct  8 19:14:44 compute-0 nova_compute[117514]:  <memory>131072</memory>
Oct  8 19:14:44 compute-0 nova_compute[117514]:  <vcpu>1</vcpu>
Oct  8 19:14:44 compute-0 nova_compute[117514]:  <metadata>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 19:14:44 compute-0 nova_compute[117514]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:      <nova:name>tempest-TestNetworkBasicOps-server-2104709800</nova:name>
Oct  8 19:14:44 compute-0 nova_compute[117514]:      <nova:creationTime>2025-10-08 19:14:44</nova:creationTime>
Oct  8 19:14:44 compute-0 nova_compute[117514]:      <nova:flavor name="m1.nano">
Oct  8 19:14:44 compute-0 nova_compute[117514]:        <nova:memory>128</nova:memory>
Oct  8 19:14:44 compute-0 nova_compute[117514]:        <nova:disk>1</nova:disk>
Oct  8 19:14:44 compute-0 nova_compute[117514]:        <nova:swap>0</nova:swap>
Oct  8 19:14:44 compute-0 nova_compute[117514]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 19:14:44 compute-0 nova_compute[117514]:        <nova:vcpus>1</nova:vcpus>
Oct  8 19:14:44 compute-0 nova_compute[117514]:      </nova:flavor>
Oct  8 19:14:44 compute-0 nova_compute[117514]:      <nova:owner>
Oct  8 19:14:44 compute-0 nova_compute[117514]:        <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct  8 19:14:44 compute-0 nova_compute[117514]:        <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct  8 19:14:44 compute-0 nova_compute[117514]:      </nova:owner>
Oct  8 19:14:44 compute-0 nova_compute[117514]:      <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:      <nova:ports>
Oct  8 19:14:44 compute-0 nova_compute[117514]:        <nova:port uuid="2139e839-c698-494f-9fbc-5605baef1d1d">
Oct  8 19:14:44 compute-0 nova_compute[117514]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:        </nova:port>
Oct  8 19:14:44 compute-0 nova_compute[117514]:      </nova:ports>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    </nova:instance>
Oct  8 19:14:44 compute-0 nova_compute[117514]:  </metadata>
Oct  8 19:14:44 compute-0 nova_compute[117514]:  <sysinfo type="smbios">
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <system>
Oct  8 19:14:44 compute-0 nova_compute[117514]:      <entry name="manufacturer">RDO</entry>
Oct  8 19:14:44 compute-0 nova_compute[117514]:      <entry name="product">OpenStack Compute</entry>
Oct  8 19:14:44 compute-0 nova_compute[117514]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 19:14:44 compute-0 nova_compute[117514]:      <entry name="serial">2cd8a1e0-1eff-4f72-b839-340a50f3f21c</entry>
Oct  8 19:14:44 compute-0 nova_compute[117514]:      <entry name="uuid">2cd8a1e0-1eff-4f72-b839-340a50f3f21c</entry>
Oct  8 19:14:44 compute-0 nova_compute[117514]:      <entry name="family">Virtual Machine</entry>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    </system>
Oct  8 19:14:44 compute-0 nova_compute[117514]:  </sysinfo>
Oct  8 19:14:44 compute-0 nova_compute[117514]:  <os>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <boot dev="hd"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <smbios mode="sysinfo"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:  </os>
Oct  8 19:14:44 compute-0 nova_compute[117514]:  <features>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <acpi/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <apic/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <vmcoreinfo/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:  </features>
Oct  8 19:14:44 compute-0 nova_compute[117514]:  <clock offset="utc">
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <timer name="hpet" present="no"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:  </clock>
Oct  8 19:14:44 compute-0 nova_compute[117514]:  <cpu mode="host-model" match="exact">
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:  </cpu>
Oct  8 19:14:44 compute-0 nova_compute[117514]:  <devices>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <disk type="file" device="disk">
Oct  8 19:14:44 compute-0 nova_compute[117514]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:      <source file="/var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/disk"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:      <target dev="vda" bus="virtio"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <disk type="file" device="cdrom">
Oct  8 19:14:44 compute-0 nova_compute[117514]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:      <source file="/var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/disk.config"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:      <target dev="sda" bus="sata"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <interface type="ethernet">
Oct  8 19:14:44 compute-0 nova_compute[117514]:      <mac address="fa:16:3e:22:30:5a"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:      <model type="virtio"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:      <mtu size="1442"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:      <target dev="tap2139e839-c6"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    </interface>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <serial type="pty">
Oct  8 19:14:44 compute-0 nova_compute[117514]:      <log file="/var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/console.log" append="off"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    </serial>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <video>
Oct  8 19:14:44 compute-0 nova_compute[117514]:      <model type="virtio"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    </video>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <input type="tablet" bus="usb"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <rng model="virtio">
Oct  8 19:14:44 compute-0 nova_compute[117514]:      <backend model="random">/dev/urandom</backend>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    </rng>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <controller type="usb" index="0"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    <memballoon model="virtio">
Oct  8 19:14:44 compute-0 nova_compute[117514]:      <stats period="10"/>
Oct  8 19:14:44 compute-0 nova_compute[117514]:    </memballoon>
Oct  8 19:14:44 compute-0 nova_compute[117514]:  </devices>
Oct  8 19:14:44 compute-0 nova_compute[117514]: </domain>
Oct  8 19:14:44 compute-0 nova_compute[117514]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.924 2 DEBUG nova.compute.manager [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Preparing to wait for external event network-vif-plugged-2139e839-c698-494f-9fbc-5605baef1d1d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.924 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.924 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.924 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.925 2 DEBUG nova.virt.libvirt.vif [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:14:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2104709800',display_name='tempest-TestNetworkBasicOps-server-2104709800',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2104709800',id=12,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD7KRl2SW48tLsDGtdUZXstQI0RJAgkIMeGypW4KhorPNM5dX0aheM9ROODmr544NnSbnVhZPkTpmB3kqR7fi9vzFVS1BaUwNIB2s1Cu3kNzwW4pHA+avxmDokcR+QqgSQ==',key_name='tempest-TestNetworkBasicOps-1494317570',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-vtc0uukp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:14:41Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=2cd8a1e0-1eff-4f72-b839-340a50f3f21c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2139e839-c698-494f-9fbc-5605baef1d1d", "address": "fa:16:3e:22:30:5a", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2139e839-c6", "ovs_interfaceid": "2139e839-c698-494f-9fbc-5605baef1d1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.925 2 DEBUG nova.network.os_vif_util [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "2139e839-c698-494f-9fbc-5605baef1d1d", "address": "fa:16:3e:22:30:5a", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2139e839-c6", "ovs_interfaceid": "2139e839-c698-494f-9fbc-5605baef1d1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.926 2 DEBUG nova.network.os_vif_util [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:22:30:5a,bridge_name='br-int',has_traffic_filtering=True,id=2139e839-c698-494f-9fbc-5605baef1d1d,network=Network(6826b0cb-7eaf-4468-bf17-e3c581bfc4ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2139e839-c6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.926 2 DEBUG os_vif [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:30:5a,bridge_name='br-int',has_traffic_filtering=True,id=2139e839-c698-494f-9fbc-5605baef1d1d,network=Network(6826b0cb-7eaf-4468-bf17-e3c581bfc4ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2139e839-c6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.927 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.927 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.929 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2139e839-c6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.929 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2139e839-c6, col_values=(('external_ids', {'iface-id': '2139e839-c698-494f-9fbc-5605baef1d1d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:22:30:5a', 'vm-uuid': '2cd8a1e0-1eff-4f72-b839-340a50f3f21c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.931 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:44 compute-0 NetworkManager[1035]: <info>  [1759950884.9327] manager: (tap2139e839-c6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:44 compute-0 nova_compute[117514]: 2025-10-08 19:14:44.939 2 INFO os_vif [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:22:30:5a,bridge_name='br-int',has_traffic_filtering=True,id=2139e839-c698-494f-9fbc-5605baef1d1d,network=Network(6826b0cb-7eaf-4468-bf17-e3c581bfc4ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2139e839-c6')#033[00m
Oct  8 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.014 2 DEBUG nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.015 2 DEBUG nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.016 2 DEBUG nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No VIF found with MAC fa:16:3e:22:30:5a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.016 2 INFO nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Using config drive#033[00m
Oct  8 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.358 2 INFO nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Creating config drive at /var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/disk.config#033[00m
Oct  8 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.363 2 DEBUG oslo_concurrency.processutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl8utjb96 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.505 2 DEBUG oslo_concurrency.processutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpl8utjb96" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:14:45 compute-0 kernel: tap2139e839-c6: entered promiscuous mode
Oct  8 19:14:45 compute-0 NetworkManager[1035]: <info>  [1759950885.5746] manager: (tap2139e839-c6): new Tun device (/org/freedesktop/NetworkManager/Devices/88)
Oct  8 19:14:45 compute-0 ovn_controller[19759]: 2025-10-08T19:14:45Z|00144|binding|INFO|Claiming lport 2139e839-c698-494f-9fbc-5605baef1d1d for this chassis.
Oct  8 19:14:45 compute-0 ovn_controller[19759]: 2025-10-08T19:14:45Z|00145|binding|INFO|2139e839-c698-494f-9fbc-5605baef1d1d: Claiming fa:16:3e:22:30:5a 10.100.0.6
Oct  8 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:45.586 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:30:5a 10.100.0.6'], port_security=['fa:16:3e:22:30:5a 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '2cd8a1e0-1eff-4f72-b839-340a50f3f21c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9bd895d2-82c4-4fc5-81d5-e70c0a9516c6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=770536b4-68ae-4751-9b56-96d89b6bc561, chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=2139e839-c698-494f-9fbc-5605baef1d1d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:14:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:45.588 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 2139e839-c698-494f-9fbc-5605baef1d1d in datapath 6826b0cb-7eaf-4468-bf17-e3c581bfc4ac bound to our chassis#033[00m
Oct  8 19:14:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:45.589 28643 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6826b0cb-7eaf-4468-bf17-e3c581bfc4ac#033[00m
Oct  8 19:14:45 compute-0 ovn_controller[19759]: 2025-10-08T19:14:45Z|00146|binding|INFO|Setting lport 2139e839-c698-494f-9fbc-5605baef1d1d ovn-installed in OVS
Oct  8 19:14:45 compute-0 ovn_controller[19759]: 2025-10-08T19:14:45Z|00147|binding|INFO|Setting lport 2139e839-c698-494f-9fbc-5605baef1d1d up in Southbound
Oct  8 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:45 compute-0 systemd-udevd[150361]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 19:14:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:45.609 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[70481a22-acd0-46fa-a39f-db00df185299]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:14:45 compute-0 systemd-machined[77568]: New machine qemu-12-instance-0000000c.
Oct  8 19:14:45 compute-0 NetworkManager[1035]: <info>  [1759950885.6233] device (tap2139e839-c6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 19:14:45 compute-0 NetworkManager[1035]: <info>  [1759950885.6243] device (tap2139e839-c6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 19:14:45 compute-0 systemd[1]: Started Virtual Machine qemu-12-instance-0000000c.
Oct  8 19:14:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:45.642 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[fd2405d4-577c-40e0-9183-b64d5f8f3ad5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:14:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:45.646 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[376cb553-ccc9-4e12-b8a2-ca82faad77ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:14:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:45.674 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[9e7c5f8f-3e13-4866-a9e8-4b3f7fd179d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:14:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:45.697 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[17a256db-7ef0-42ec-8892-e09adeaefef8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6826b0cb-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:04:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 832, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 6, 'rx_bytes': 832, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 154024, 'reachable_time': 41558, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 150372, 'error': None, 'target': 'ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:14:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:45.721 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[1b73410e-8f5c-4a59-a9b9-69ddff97413f]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap6826b0cb-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 154039, 'tstamp': 154039}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 150375, 'error': None, 'target': 'ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap6826b0cb-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 154043, 'tstamp': 154043}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 150375, 'error': None, 'target': 'ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:14:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:45.723 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6826b0cb-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:45.726 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6826b0cb-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:14:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:45.726 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:14:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:45.727 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6826b0cb-70, col_values=(('external_ids', {'iface-id': 'eabc4672-d176-4f11-b5f6-bcbea840c3e8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:14:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:14:45.727 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.764 2 DEBUG nova.compute.manager [req-69e4a6b9-494e-4bac-82e5-a2e121a5386f req-2d30ef81-3f22-4512-b0bd-dae1e7f41322 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Received event network-vif-plugged-2139e839-c698-494f-9fbc-5605baef1d1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.765 2 DEBUG oslo_concurrency.lockutils [req-69e4a6b9-494e-4bac-82e5-a2e121a5386f req-2d30ef81-3f22-4512-b0bd-dae1e7f41322 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.769 2 DEBUG oslo_concurrency.lockutils [req-69e4a6b9-494e-4bac-82e5-a2e121a5386f req-2d30ef81-3f22-4512-b0bd-dae1e7f41322 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.769 2 DEBUG oslo_concurrency.lockutils [req-69e4a6b9-494e-4bac-82e5-a2e121a5386f req-2d30ef81-3f22-4512-b0bd-dae1e7f41322 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.770 2 DEBUG nova.compute.manager [req-69e4a6b9-494e-4bac-82e5-a2e121a5386f req-2d30ef81-3f22-4512-b0bd-dae1e7f41322 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Processing event network-vif-plugged-2139e839-c698-494f-9fbc-5605baef1d1d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.972 2 DEBUG nova.network.neutron [req-a5cff1f8-d980-48ea-80eb-6186ce0832e7 req-7f55d453-1e72-4d52-8451-b897f4dff90a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Updated VIF entry in instance network info cache for port 2139e839-c698-494f-9fbc-5605baef1d1d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.972 2 DEBUG nova.network.neutron [req-a5cff1f8-d980-48ea-80eb-6186ce0832e7 req-7f55d453-1e72-4d52-8451-b897f4dff90a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Updating instance_info_cache with network_info: [{"id": "2139e839-c698-494f-9fbc-5605baef1d1d", "address": "fa:16:3e:22:30:5a", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2139e839-c6", "ovs_interfaceid": "2139e839-c698-494f-9fbc-5605baef1d1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:14:45 compute-0 nova_compute[117514]: 2025-10-08 19:14:45.991 2 DEBUG oslo_concurrency.lockutils [req-a5cff1f8-d980-48ea-80eb-6186ce0832e7 req-7f55d453-1e72-4d52-8451-b897f4dff90a bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-2cd8a1e0-1eff-4f72-b839-340a50f3f21c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.555 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950886.554685, 2cd8a1e0-1eff-4f72-b839-340a50f3f21c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.555 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] VM Started (Lifecycle Event)#033[00m
Oct  8 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.557 2 DEBUG nova.compute.manager [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.560 2 DEBUG nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.563 2 INFO nova.virt.libvirt.driver [-] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Instance spawned successfully.#033[00m
Oct  8 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.564 2 DEBUG nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.580 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.586 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.590 2 DEBUG nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.591 2 DEBUG nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.591 2 DEBUG nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.592 2 DEBUG nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.592 2 DEBUG nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.593 2 DEBUG nova.virt.libvirt.driver [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.618 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.619 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950886.5553448, 2cd8a1e0-1eff-4f72-b839-340a50f3f21c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.619 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] VM Paused (Lifecycle Event)#033[00m
Oct  8 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.642 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.646 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950886.5599563, 2cd8a1e0-1eff-4f72-b839-340a50f3f21c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.646 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] VM Resumed (Lifecycle Event)#033[00m
Oct  8 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.650 2 INFO nova.compute.manager [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Took 4.69 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.650 2 DEBUG nova.compute.manager [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.663 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.667 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.695 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.713 2 INFO nova.compute.manager [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Took 5.15 seconds to build instance.#033[00m
Oct  8 19:14:46 compute-0 nova_compute[117514]: 2025-10-08 19:14:46.728 2 DEBUG oslo_concurrency.lockutils [None req-412f3982-12c8-43cc-9f3b-ea0a1598a837 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.241s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:14:47 compute-0 nova_compute[117514]: 2025-10-08 19:14:47.844 2 DEBUG nova.compute.manager [req-9aa6218b-debe-40b6-85d1-98bffc582343 req-83c6c786-0aba-4b25-8d15-8815e381e64e bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Received event network-vif-plugged-2139e839-c698-494f-9fbc-5605baef1d1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:14:47 compute-0 nova_compute[117514]: 2025-10-08 19:14:47.845 2 DEBUG oslo_concurrency.lockutils [req-9aa6218b-debe-40b6-85d1-98bffc582343 req-83c6c786-0aba-4b25-8d15-8815e381e64e bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:14:47 compute-0 nova_compute[117514]: 2025-10-08 19:14:47.845 2 DEBUG oslo_concurrency.lockutils [req-9aa6218b-debe-40b6-85d1-98bffc582343 req-83c6c786-0aba-4b25-8d15-8815e381e64e bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:14:47 compute-0 nova_compute[117514]: 2025-10-08 19:14:47.846 2 DEBUG oslo_concurrency.lockutils [req-9aa6218b-debe-40b6-85d1-98bffc582343 req-83c6c786-0aba-4b25-8d15-8815e381e64e bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:14:47 compute-0 nova_compute[117514]: 2025-10-08 19:14:47.846 2 DEBUG nova.compute.manager [req-9aa6218b-debe-40b6-85d1-98bffc582343 req-83c6c786-0aba-4b25-8d15-8815e381e64e bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] No waiting events found dispatching network-vif-plugged-2139e839-c698-494f-9fbc-5605baef1d1d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:14:47 compute-0 nova_compute[117514]: 2025-10-08 19:14:47.847 2 WARNING nova.compute.manager [req-9aa6218b-debe-40b6-85d1-98bffc582343 req-83c6c786-0aba-4b25-8d15-8815e381e64e bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Received unexpected event network-vif-plugged-2139e839-c698-494f-9fbc-5605baef1d1d for instance with vm_state active and task_state None.#033[00m
Oct  8 19:14:49 compute-0 nova_compute[117514]: 2025-10-08 19:14:49.931 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:50 compute-0 podman[150384]: 2025-10-08 19:14:50.672219968 +0000 UTC m=+0.082401759 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 19:14:51 compute-0 nova_compute[117514]: 2025-10-08 19:14:51.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:54 compute-0 nova_compute[117514]: 2025-10-08 19:14:54.283 2 DEBUG nova.compute.manager [req-127840b5-84fb-40f7-9219-e6c1e1f035b0 req-6ccc32d5-16cf-48fa-96e7-ef506b9537de bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Received event network-changed-2139e839-c698-494f-9fbc-5605baef1d1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:14:54 compute-0 nova_compute[117514]: 2025-10-08 19:14:54.283 2 DEBUG nova.compute.manager [req-127840b5-84fb-40f7-9219-e6c1e1f035b0 req-6ccc32d5-16cf-48fa-96e7-ef506b9537de bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Refreshing instance network info cache due to event network-changed-2139e839-c698-494f-9fbc-5605baef1d1d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 19:14:54 compute-0 nova_compute[117514]: 2025-10-08 19:14:54.284 2 DEBUG oslo_concurrency.lockutils [req-127840b5-84fb-40f7-9219-e6c1e1f035b0 req-6ccc32d5-16cf-48fa-96e7-ef506b9537de bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-2cd8a1e0-1eff-4f72-b839-340a50f3f21c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:14:54 compute-0 nova_compute[117514]: 2025-10-08 19:14:54.284 2 DEBUG oslo_concurrency.lockutils [req-127840b5-84fb-40f7-9219-e6c1e1f035b0 req-6ccc32d5-16cf-48fa-96e7-ef506b9537de bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-2cd8a1e0-1eff-4f72-b839-340a50f3f21c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:14:54 compute-0 nova_compute[117514]: 2025-10-08 19:14:54.285 2 DEBUG nova.network.neutron [req-127840b5-84fb-40f7-9219-e6c1e1f035b0 req-6ccc32d5-16cf-48fa-96e7-ef506b9537de bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Refreshing network info cache for port 2139e839-c698-494f-9fbc-5605baef1d1d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 19:14:54 compute-0 nova_compute[117514]: 2025-10-08 19:14:54.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:56 compute-0 nova_compute[117514]: 2025-10-08 19:14:56.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:14:56 compute-0 nova_compute[117514]: 2025-10-08 19:14:56.525 2 DEBUG nova.network.neutron [req-127840b5-84fb-40f7-9219-e6c1e1f035b0 req-6ccc32d5-16cf-48fa-96e7-ef506b9537de bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Updated VIF entry in instance network info cache for port 2139e839-c698-494f-9fbc-5605baef1d1d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 19:14:56 compute-0 nova_compute[117514]: 2025-10-08 19:14:56.526 2 DEBUG nova.network.neutron [req-127840b5-84fb-40f7-9219-e6c1e1f035b0 req-6ccc32d5-16cf-48fa-96e7-ef506b9537de bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Updating instance_info_cache with network_info: [{"id": "2139e839-c698-494f-9fbc-5605baef1d1d", "address": "fa:16:3e:22:30:5a", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2139e839-c6", "ovs_interfaceid": "2139e839-c698-494f-9fbc-5605baef1d1d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:14:56 compute-0 nova_compute[117514]: 2025-10-08 19:14:56.549 2 DEBUG oslo_concurrency.lockutils [req-127840b5-84fb-40f7-9219-e6c1e1f035b0 req-6ccc32d5-16cf-48fa-96e7-ef506b9537de bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-2cd8a1e0-1eff-4f72-b839-340a50f3f21c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:14:58 compute-0 ovn_controller[19759]: 2025-10-08T19:14:58Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:22:30:5a 10.100.0.6
Oct  8 19:14:58 compute-0 ovn_controller[19759]: 2025-10-08T19:14:58Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:22:30:5a 10.100.0.6
Oct  8 19:14:59 compute-0 nova_compute[117514]: 2025-10-08 19:14:59.936 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:00 compute-0 podman[150424]: 2025-10-08 19:15:00.646970216 +0000 UTC m=+0.064794601 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, 
io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  8 19:15:01 compute-0 nova_compute[117514]: 2025-10-08 19:15:01.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:04 compute-0 nova_compute[117514]: 2025-10-08 19:15:04.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:06 compute-0 nova_compute[117514]: 2025-10-08 19:15:06.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:06 compute-0 podman[150446]: 2025-10-08 19:15:06.660704661 +0000 UTC m=+0.068640230 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 19:15:06 compute-0 podman[150445]: 2025-10-08 19:15:06.661845744 +0000 UTC m=+0.081528221 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 19:15:06 compute-0 podman[150444]: 2025-10-08 19:15:06.670197675 +0000 UTC m=+0.089265405 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., release=1755695350, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct  8 19:15:07 compute-0 nova_compute[117514]: 2025-10-08 19:15:07.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:15:09 compute-0 nova_compute[117514]: 2025-10-08 19:15:09.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:15:09 compute-0 nova_compute[117514]: 2025-10-08 19:15:09.752 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:15:09 compute-0 nova_compute[117514]: 2025-10-08 19:15:09.753 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:15:09 compute-0 nova_compute[117514]: 2025-10-08 19:15:09.753 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:15:09 compute-0 nova_compute[117514]: 2025-10-08 19:15:09.753 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 19:15:09 compute-0 nova_compute[117514]: 2025-10-08 19:15:09.837 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:15:09 compute-0 podman[150512]: 2025-10-08 19:15:09.892189396 +0000 UTC m=+0.076619891 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_managed=true, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  8 19:15:09 compute-0 nova_compute[117514]: 2025-10-08 19:15:09.914 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:15:09 compute-0 nova_compute[117514]: 2025-10-08 19:15:09.915 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:15:09 compute-0 podman[150516]: 2025-10-08 19:15:09.920642496 +0000 UTC m=+0.083319943 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  8 19:15:09 compute-0 nova_compute[117514]: 2025-10-08 19:15:09.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:09 compute-0 podman[150513]: 2025-10-08 19:15:09.947024197 +0000 UTC m=+0.124953564 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct  8 19:15:09 compute-0 nova_compute[117514]: 2025-10-08 19:15:09.968 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:15:09 compute-0 nova_compute[117514]: 2025-10-08 19:15:09.974 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:15:10 compute-0 nova_compute[117514]: 2025-10-08 19:15:10.038 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:15:10 compute-0 nova_compute[117514]: 2025-10-08 19:15:10.039 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:15:10 compute-0 nova_compute[117514]: 2025-10-08 19:15:10.108 2 DEBUG oslo_concurrency.processutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:15:10 compute-0 nova_compute[117514]: 2025-10-08 19:15:10.343 2 INFO nova.compute.manager [None req-e9073bc5-124f-4ca9-b785-d7b35abced05 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Get console output#033[00m
Oct  8 19:15:10 compute-0 nova_compute[117514]: 2025-10-08 19:15:10.349 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 19:15:10 compute-0 nova_compute[117514]: 2025-10-08 19:15:10.359 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 19:15:10 compute-0 nova_compute[117514]: 2025-10-08 19:15:10.360 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5786MB free_disk=73.35597610473633GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 19:15:10 compute-0 nova_compute[117514]: 2025-10-08 19:15:10.360 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:15:10 compute-0 nova_compute[117514]: 2025-10-08 19:15:10.360 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:15:10 compute-0 nova_compute[117514]: 2025-10-08 19:15:10.446 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Instance 5e004931-f1db-408c-9f7a-6c6c50c5f8ef actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 19:15:10 compute-0 nova_compute[117514]: 2025-10-08 19:15:10.447 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Instance 2cd8a1e0-1eff-4f72-b839-340a50f3f21c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  8 19:15:10 compute-0 nova_compute[117514]: 2025-10-08 19:15:10.447 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 19:15:10 compute-0 nova_compute[117514]: 2025-10-08 19:15:10.448 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 19:15:10 compute-0 nova_compute[117514]: 2025-10-08 19:15:10.515 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:15:10 compute-0 nova_compute[117514]: 2025-10-08 19:15:10.529 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:15:10 compute-0 nova_compute[117514]: 2025-10-08 19:15:10.559 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 19:15:10 compute-0 nova_compute[117514]: 2025-10-08 19:15:10.559 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.199s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:15:11 compute-0 nova_compute[117514]: 2025-10-08 19:15:11.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:11 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:11.431 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:75:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '5e:14:dd:63:55:2a'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:15:11 compute-0 nova_compute[117514]: 2025-10-08 19:15:11.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:11 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:11.432 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 19:15:11 compute-0 nova_compute[117514]: 2025-10-08 19:15:11.555 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:15:11 compute-0 nova_compute[117514]: 2025-10-08 19:15:11.556 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:15:11 compute-0 nova_compute[117514]: 2025-10-08 19:15:11.556 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 19:15:11 compute-0 nova_compute[117514]: 2025-10-08 19:15:11.585 2 DEBUG nova.compute.manager [req-54561677-c8aa-44ec-ae59-06183821da70 req-b0f08bc3-b63e-49e3-aa39-babd3007c281 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received event network-changed-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:15:11 compute-0 nova_compute[117514]: 2025-10-08 19:15:11.585 2 DEBUG nova.compute.manager [req-54561677-c8aa-44ec-ae59-06183821da70 req-b0f08bc3-b63e-49e3-aa39-babd3007c281 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Refreshing instance network info cache due to event network-changed-ae9e7968-10b0-4606-9fa3-c91374cf1cc1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 19:15:11 compute-0 nova_compute[117514]: 2025-10-08 19:15:11.586 2 DEBUG oslo_concurrency.lockutils [req-54561677-c8aa-44ec-ae59-06183821da70 req-b0f08bc3-b63e-49e3-aa39-babd3007c281 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:15:11 compute-0 nova_compute[117514]: 2025-10-08 19:15:11.586 2 DEBUG oslo_concurrency.lockutils [req-54561677-c8aa-44ec-ae59-06183821da70 req-b0f08bc3-b63e-49e3-aa39-babd3007c281 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:15:11 compute-0 nova_compute[117514]: 2025-10-08 19:15:11.587 2 DEBUG nova.network.neutron [req-54561677-c8aa-44ec-ae59-06183821da70 req-b0f08bc3-b63e-49e3-aa39-babd3007c281 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Refreshing network info cache for port ae9e7968-10b0-4606-9fa3-c91374cf1cc1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 19:15:11 compute-0 nova_compute[117514]: 2025-10-08 19:15:11.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:15:11 compute-0 nova_compute[117514]: 2025-10-08 19:15:11.718 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 19:15:11 compute-0 nova_compute[117514]: 2025-10-08 19:15:11.718 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 19:15:12 compute-0 nova_compute[117514]: 2025-10-08 19:15:12.199 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:15:12 compute-0 nova_compute[117514]: 2025-10-08 19:15:12.602 2 INFO nova.compute.manager [None req-47bb72c8-0d18-4fcf-aba2-3c694aa24a37 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Get console output#033[00m
Oct  8 19:15:12 compute-0 nova_compute[117514]: 2025-10-08 19:15:12.609 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 19:15:13 compute-0 nova_compute[117514]: 2025-10-08 19:15:13.374 2 DEBUG nova.network.neutron [req-54561677-c8aa-44ec-ae59-06183821da70 req-b0f08bc3-b63e-49e3-aa39-babd3007c281 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Updated VIF entry in instance network info cache for port ae9e7968-10b0-4606-9fa3-c91374cf1cc1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 19:15:13 compute-0 nova_compute[117514]: 2025-10-08 19:15:13.375 2 DEBUG nova.network.neutron [req-54561677-c8aa-44ec-ae59-06183821da70 req-b0f08bc3-b63e-49e3-aa39-babd3007c281 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Updating instance_info_cache with network_info: [{"id": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "address": "fa:16:3e:23:50:87", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae9e7968-10", "ovs_interfaceid": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:15:13 compute-0 nova_compute[117514]: 2025-10-08 19:15:13.393 2 DEBUG oslo_concurrency.lockutils [req-54561677-c8aa-44ec-ae59-06183821da70 req-b0f08bc3-b63e-49e3-aa39-babd3007c281 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:15:13 compute-0 nova_compute[117514]: 2025-10-08 19:15:13.395 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquired lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:15:13 compute-0 nova_compute[117514]: 2025-10-08 19:15:13.395 2 DEBUG nova.network.neutron [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  8 19:15:13 compute-0 nova_compute[117514]: 2025-10-08 19:15:13.396 2 DEBUG nova.objects.instance [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 5e004931-f1db-408c-9f7a-6c6c50c5f8ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:15:13 compute-0 nova_compute[117514]: 2025-10-08 19:15:13.661 2 DEBUG nova.compute.manager [req-757f53da-78be-43f6-8cc8-4e3610dd2568 req-007b4cb7-2009-4b31-9f0d-f8f66011b647 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received event network-vif-unplugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:15:13 compute-0 nova_compute[117514]: 2025-10-08 19:15:13.662 2 DEBUG oslo_concurrency.lockutils [req-757f53da-78be-43f6-8cc8-4e3610dd2568 req-007b4cb7-2009-4b31-9f0d-f8f66011b647 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:15:13 compute-0 nova_compute[117514]: 2025-10-08 19:15:13.663 2 DEBUG oslo_concurrency.lockutils [req-757f53da-78be-43f6-8cc8-4e3610dd2568 req-007b4cb7-2009-4b31-9f0d-f8f66011b647 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:15:13 compute-0 nova_compute[117514]: 2025-10-08 19:15:13.663 2 DEBUG oslo_concurrency.lockutils [req-757f53da-78be-43f6-8cc8-4e3610dd2568 req-007b4cb7-2009-4b31-9f0d-f8f66011b647 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:15:13 compute-0 nova_compute[117514]: 2025-10-08 19:15:13.664 2 DEBUG nova.compute.manager [req-757f53da-78be-43f6-8cc8-4e3610dd2568 req-007b4cb7-2009-4b31-9f0d-f8f66011b647 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] No waiting events found dispatching network-vif-unplugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:15:13 compute-0 nova_compute[117514]: 2025-10-08 19:15:13.664 2 WARNING nova.compute.manager [req-757f53da-78be-43f6-8cc8-4e3610dd2568 req-007b4cb7-2009-4b31-9f0d-f8f66011b647 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received unexpected event network-vif-unplugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 for instance with vm_state active and task_state None.#033[00m
Oct  8 19:15:13 compute-0 nova_compute[117514]: 2025-10-08 19:15:13.665 2 DEBUG nova.compute.manager [req-757f53da-78be-43f6-8cc8-4e3610dd2568 req-007b4cb7-2009-4b31-9f0d-f8f66011b647 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received event network-vif-plugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:15:13 compute-0 nova_compute[117514]: 2025-10-08 19:15:13.665 2 DEBUG oslo_concurrency.lockutils [req-757f53da-78be-43f6-8cc8-4e3610dd2568 req-007b4cb7-2009-4b31-9f0d-f8f66011b647 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:15:13 compute-0 nova_compute[117514]: 2025-10-08 19:15:13.666 2 DEBUG oslo_concurrency.lockutils [req-757f53da-78be-43f6-8cc8-4e3610dd2568 req-007b4cb7-2009-4b31-9f0d-f8f66011b647 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:15:13 compute-0 nova_compute[117514]: 2025-10-08 19:15:13.666 2 DEBUG oslo_concurrency.lockutils [req-757f53da-78be-43f6-8cc8-4e3610dd2568 req-007b4cb7-2009-4b31-9f0d-f8f66011b647 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:15:13 compute-0 nova_compute[117514]: 2025-10-08 19:15:13.667 2 DEBUG nova.compute.manager [req-757f53da-78be-43f6-8cc8-4e3610dd2568 req-007b4cb7-2009-4b31-9f0d-f8f66011b647 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] No waiting events found dispatching network-vif-plugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:15:13 compute-0 nova_compute[117514]: 2025-10-08 19:15:13.667 2 WARNING nova.compute.manager [req-757f53da-78be-43f6-8cc8-4e3610dd2568 req-007b4cb7-2009-4b31-9f0d-f8f66011b647 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received unexpected event network-vif-plugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 for instance with vm_state active and task_state None.#033[00m
Oct  8 19:15:14 compute-0 nova_compute[117514]: 2025-10-08 19:15:14.708 2 DEBUG nova.compute.manager [req-4d4b5d3f-bfd5-4876-8a22-9c955f3dfac5 req-e0b22032-c728-46d3-9aed-b54a8f80ae7d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received event network-changed-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:15:14 compute-0 nova_compute[117514]: 2025-10-08 19:15:14.709 2 DEBUG nova.compute.manager [req-4d4b5d3f-bfd5-4876-8a22-9c955f3dfac5 req-e0b22032-c728-46d3-9aed-b54a8f80ae7d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Refreshing instance network info cache due to event network-changed-ae9e7968-10b0-4606-9fa3-c91374cf1cc1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 19:15:14 compute-0 nova_compute[117514]: 2025-10-08 19:15:14.709 2 DEBUG oslo_concurrency.lockutils [req-4d4b5d3f-bfd5-4876-8a22-9c955f3dfac5 req-e0b22032-c728-46d3-9aed-b54a8f80ae7d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:15:14 compute-0 nova_compute[117514]: 2025-10-08 19:15:14.813 2 INFO nova.compute.manager [None req-948129db-58f7-42b5-be85-39f515f6e1f5 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Get console output#033[00m
Oct  8 19:15:14 compute-0 nova_compute[117514]: 2025-10-08 19:15:14.819 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 19:15:14 compute-0 nova_compute[117514]: 2025-10-08 19:15:14.821 2 DEBUG nova.network.neutron [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Updating instance_info_cache with network_info: [{"id": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "address": "fa:16:3e:23:50:87", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae9e7968-10", "ovs_interfaceid": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:15:14 compute-0 nova_compute[117514]: 2025-10-08 19:15:14.836 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Releasing lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:15:14 compute-0 nova_compute[117514]: 2025-10-08 19:15:14.837 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  8 19:15:14 compute-0 nova_compute[117514]: 2025-10-08 19:15:14.837 2 DEBUG oslo_concurrency.lockutils [req-4d4b5d3f-bfd5-4876-8a22-9c955f3dfac5 req-e0b22032-c728-46d3-9aed-b54a8f80ae7d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:15:14 compute-0 nova_compute[117514]: 2025-10-08 19:15:14.837 2 DEBUG nova.network.neutron [req-4d4b5d3f-bfd5-4876-8a22-9c955f3dfac5 req-e0b22032-c728-46d3-9aed-b54a8f80ae7d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Refreshing network info cache for port ae9e7968-10b0-4606-9fa3-c91374cf1cc1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 19:15:14 compute-0 nova_compute[117514]: 2025-10-08 19:15:14.840 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:15:14 compute-0 nova_compute[117514]: 2025-10-08 19:15:14.841 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:15:14 compute-0 nova_compute[117514]: 2025-10-08 19:15:14.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.683 2 DEBUG oslo_concurrency.lockutils [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.683 2 DEBUG oslo_concurrency.lockutils [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.683 2 DEBUG oslo_concurrency.lockutils [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.684 2 DEBUG oslo_concurrency.lockutils [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.684 2 DEBUG oslo_concurrency.lockutils [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.685 2 INFO nova.compute.manager [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Terminating instance#033[00m
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.686 2 DEBUG nova.compute.manager [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 19:15:15 compute-0 kernel: tap2139e839-c6 (unregistering): left promiscuous mode
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:15:15 compute-0 NetworkManager[1035]: <info>  [1759950915.7200] device (tap2139e839-c6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:15 compute-0 ovn_controller[19759]: 2025-10-08T19:15:15Z|00148|binding|INFO|Releasing lport 2139e839-c698-494f-9fbc-5605baef1d1d from this chassis (sb_readonly=0)
Oct  8 19:15:15 compute-0 ovn_controller[19759]: 2025-10-08T19:15:15Z|00149|binding|INFO|Setting lport 2139e839-c698-494f-9fbc-5605baef1d1d down in Southbound
Oct  8 19:15:15 compute-0 ovn_controller[19759]: 2025-10-08T19:15:15Z|00150|binding|INFO|Removing iface tap2139e839-c6 ovn-installed in OVS
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:15 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:15.739 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:30:5a 10.100.0.6'], port_security=['fa:16:3e:22:30:5a 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '2cd8a1e0-1eff-4f72-b839-340a50f3f21c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9bd895d2-82c4-4fc5-81d5-e70c0a9516c6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=770536b4-68ae-4751-9b56-96d89b6bc561, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=2139e839-c698-494f-9fbc-5605baef1d1d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:15:15 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:15.741 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 2139e839-c698-494f-9fbc-5605baef1d1d in datapath 6826b0cb-7eaf-4468-bf17-e3c581bfc4ac unbound from our chassis#033[00m
Oct  8 19:15:15 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:15.743 28643 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6826b0cb-7eaf-4468-bf17-e3c581bfc4ac#033[00m
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.758 2 DEBUG nova.compute.manager [req-97500e66-791d-4006-b8a5-283478a1dfde req-2e81e0ec-35d7-41a5-bfed-33976bae0a6c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received event network-vif-plugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.758 2 DEBUG oslo_concurrency.lockutils [req-97500e66-791d-4006-b8a5-283478a1dfde req-2e81e0ec-35d7-41a5-bfed-33976bae0a6c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.758 2 DEBUG oslo_concurrency.lockutils [req-97500e66-791d-4006-b8a5-283478a1dfde req-2e81e0ec-35d7-41a5-bfed-33976bae0a6c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.758 2 DEBUG oslo_concurrency.lockutils [req-97500e66-791d-4006-b8a5-283478a1dfde req-2e81e0ec-35d7-41a5-bfed-33976bae0a6c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.759 2 DEBUG nova.compute.manager [req-97500e66-791d-4006-b8a5-283478a1dfde req-2e81e0ec-35d7-41a5-bfed-33976bae0a6c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] No waiting events found dispatching network-vif-plugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.759 2 WARNING nova.compute.manager [req-97500e66-791d-4006-b8a5-283478a1dfde req-2e81e0ec-35d7-41a5-bfed-33976bae0a6c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received unexpected event network-vif-plugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 for instance with vm_state active and task_state None.#033[00m
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.759 2 DEBUG nova.compute.manager [req-97500e66-791d-4006-b8a5-283478a1dfde req-2e81e0ec-35d7-41a5-bfed-33976bae0a6c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received event network-vif-plugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.759 2 DEBUG oslo_concurrency.lockutils [req-97500e66-791d-4006-b8a5-283478a1dfde req-2e81e0ec-35d7-41a5-bfed-33976bae0a6c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.759 2 DEBUG oslo_concurrency.lockutils [req-97500e66-791d-4006-b8a5-283478a1dfde req-2e81e0ec-35d7-41a5-bfed-33976bae0a6c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.759 2 DEBUG oslo_concurrency.lockutils [req-97500e66-791d-4006-b8a5-283478a1dfde req-2e81e0ec-35d7-41a5-bfed-33976bae0a6c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.759 2 DEBUG nova.compute.manager [req-97500e66-791d-4006-b8a5-283478a1dfde req-2e81e0ec-35d7-41a5-bfed-33976bae0a6c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] No waiting events found dispatching network-vif-plugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.760 2 WARNING nova.compute.manager [req-97500e66-791d-4006-b8a5-283478a1dfde req-2e81e0ec-35d7-41a5-bfed-33976bae0a6c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received unexpected event network-vif-plugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 for instance with vm_state active and task_state None.#033[00m
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:15 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:15.775 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[efbf1580-5f18-4106-9d0c-b6bab3013b54]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:15:15 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Oct  8 19:15:15 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000c.scope: Consumed 13.444s CPU time.
Oct  8 19:15:15 compute-0 systemd-machined[77568]: Machine qemu-12-instance-0000000c terminated.
Oct  8 19:15:15 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:15.815 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[69b0faff-efa0-456c-9fe0-e9aca62e53d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:15:15 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:15.821 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[cd5f187f-017c-464b-8a4a-afbab476ba4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:15:15 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:15.868 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[bd0d32c2-acb6-41a5-9998-29fac91f4350]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:15:15 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:15.900 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[84ff2abd-02cc-4f3f-a799-cc8aa38ebdff]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6826b0cb-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:04:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 8, 'rx_bytes': 1000, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 8, 'rx_bytes': 1000, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 154024, 'reachable_time': 41558, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 150595, 'error': None, 'target': 'ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:15 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:15.930 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[62d58742-c32a-4f2b-b821-32e8d78cde8d]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap6826b0cb-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 154039, 'tstamp': 154039}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 150598, 'error': None, 'target': 'ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap6826b0cb-71'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 154043, 'tstamp': 154043}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 150598, 'error': None, 'target': 'ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:15:15 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:15.932 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6826b0cb-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:15 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:15.946 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6826b0cb-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:15:15 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:15.947 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:15:15 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:15.947 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6826b0cb-70, col_values=(('external_ids', {'iface-id': 'eabc4672-d176-4f11-b5f6-bcbea840c3e8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:15:15 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:15.948 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.968 2 INFO nova.virt.libvirt.driver [-] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Instance destroyed successfully.#033[00m
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.969 2 DEBUG nova.objects.instance [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'resources' on Instance uuid 2cd8a1e0-1eff-4f72-b839-340a50f3f21c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.983 2 DEBUG nova.virt.libvirt.vif [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:14:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2104709800',display_name='tempest-TestNetworkBasicOps-server-2104709800',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2104709800',id=12,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD7KRl2SW48tLsDGtdUZXstQI0RJAgkIMeGypW4KhorPNM5dX0aheM9ROODmr544NnSbnVhZPkTpmB3kqR7fi9vzFVS1BaUwNIB2s1Cu3kNzwW4pHA+avxmDokcR+QqgSQ==',key_name='tempest-TestNetworkBasicOps-1494317570',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:14:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-vtc0uukp',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:14:46Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=2cd8a1e0-1eff-4f72-b839-340a50f3f21c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2139e839-c698-494f-9fbc-5605baef1d1d", "address": "fa:16:3e:22:30:5a", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2139e839-c6", "ovs_interfaceid": "2139e839-c698-494f-9fbc-5605baef1d1d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.983 2 DEBUG nova.network.os_vif_util [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "2139e839-c698-494f-9fbc-5605baef1d1d", "address": "fa:16:3e:22:30:5a", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2139e839-c6", "ovs_interfaceid": "2139e839-c698-494f-9fbc-5605baef1d1d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.985 2 DEBUG nova.network.os_vif_util [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:22:30:5a,bridge_name='br-int',has_traffic_filtering=True,id=2139e839-c698-494f-9fbc-5605baef1d1d,network=Network(6826b0cb-7eaf-4468-bf17-e3c581bfc4ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2139e839-c6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.985 2 DEBUG os_vif [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:30:5a,bridge_name='br-int',has_traffic_filtering=True,id=2139e839-c698-494f-9fbc-5605baef1d1d,network=Network(6826b0cb-7eaf-4468-bf17-e3c581bfc4ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2139e839-c6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.988 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2139e839-c6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.996 2 INFO os_vif [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:22:30:5a,bridge_name='br-int',has_traffic_filtering=True,id=2139e839-c698-494f-9fbc-5605baef1d1d,network=Network(6826b0cb-7eaf-4468-bf17-e3c581bfc4ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2139e839-c6')#033[00m
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.997 2 INFO nova.virt.libvirt.driver [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Deleting instance files /var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c_del#033[00m
Oct  8 19:15:15 compute-0 nova_compute[117514]: 2025-10-08 19:15:15.998 2 INFO nova.virt.libvirt.driver [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Deletion of /var/lib/nova/instances/2cd8a1e0-1eff-4f72-b839-340a50f3f21c_del complete#033[00m
Oct  8 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.055 2 INFO nova.compute.manager [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.056 2 DEBUG oslo.service.loopingcall [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.056 2 DEBUG nova.compute.manager [-] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.056 2 DEBUG nova.network.neutron [-] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.222 2 DEBUG nova.network.neutron [req-4d4b5d3f-bfd5-4876-8a22-9c955f3dfac5 req-e0b22032-c728-46d3-9aed-b54a8f80ae7d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Updated VIF entry in instance network info cache for port ae9e7968-10b0-4606-9fa3-c91374cf1cc1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.223 2 DEBUG nova.network.neutron [req-4d4b5d3f-bfd5-4876-8a22-9c955f3dfac5 req-e0b22032-c728-46d3-9aed-b54a8f80ae7d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Updating instance_info_cache with network_info: [{"id": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "address": "fa:16:3e:23:50:87", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae9e7968-10", "ovs_interfaceid": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.238 2 DEBUG oslo_concurrency.lockutils [req-4d4b5d3f-bfd5-4876-8a22-9c955f3dfac5 req-e0b22032-c728-46d3-9aed-b54a8f80ae7d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.693 2 DEBUG nova.network.neutron [-] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.711 2 INFO nova.compute.manager [-] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Took 0.65 seconds to deallocate network for instance.#033[00m
Oct  8 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.754 2 DEBUG oslo_concurrency.lockutils [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.755 2 DEBUG oslo_concurrency.lockutils [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.779 2 DEBUG nova.compute.manager [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Received event network-changed-2139e839-c698-494f-9fbc-5605baef1d1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.779 2 DEBUG nova.compute.manager [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Refreshing instance network info cache due to event network-changed-2139e839-c698-494f-9fbc-5605baef1d1d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.780 2 DEBUG oslo_concurrency.lockutils [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-2cd8a1e0-1eff-4f72-b839-340a50f3f21c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.780 2 DEBUG oslo_concurrency.lockutils [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-2cd8a1e0-1eff-4f72-b839-340a50f3f21c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.781 2 DEBUG nova.network.neutron [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Refreshing network info cache for port 2139e839-c698-494f-9fbc-5605baef1d1d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.843 2 DEBUG nova.compute.provider_tree [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.864 2 DEBUG nova.scheduler.client.report [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.885 2 DEBUG oslo_concurrency.lockutils [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.909 2 INFO nova.scheduler.client.report [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Deleted allocations for instance 2cd8a1e0-1eff-4f72-b839-340a50f3f21c#033[00m
Oct  8 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.912 2 DEBUG nova.network.neutron [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 19:15:16 compute-0 nova_compute[117514]: 2025-10-08 19:15:16.975 2 DEBUG oslo_concurrency.lockutils [None req-bd6484cc-a417-474d-9237-3aec456f4243 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.292s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:15:17 compute-0 nova_compute[117514]: 2025-10-08 19:15:17.199 2 DEBUG nova.network.neutron [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:15:17 compute-0 nova_compute[117514]: 2025-10-08 19:15:17.216 2 DEBUG oslo_concurrency.lockutils [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-2cd8a1e0-1eff-4f72-b839-340a50f3f21c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:15:17 compute-0 nova_compute[117514]: 2025-10-08 19:15:17.216 2 DEBUG nova.compute.manager [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Received event network-vif-unplugged-2139e839-c698-494f-9fbc-5605baef1d1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:15:17 compute-0 nova_compute[117514]: 2025-10-08 19:15:17.217 2 DEBUG oslo_concurrency.lockutils [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:15:17 compute-0 nova_compute[117514]: 2025-10-08 19:15:17.218 2 DEBUG oslo_concurrency.lockutils [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:15:17 compute-0 nova_compute[117514]: 2025-10-08 19:15:17.218 2 DEBUG oslo_concurrency.lockutils [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:15:17 compute-0 nova_compute[117514]: 2025-10-08 19:15:17.219 2 DEBUG nova.compute.manager [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] No waiting events found dispatching network-vif-unplugged-2139e839-c698-494f-9fbc-5605baef1d1d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:15:17 compute-0 nova_compute[117514]: 2025-10-08 19:15:17.220 2 WARNING nova.compute.manager [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Received unexpected event network-vif-unplugged-2139e839-c698-494f-9fbc-5605baef1d1d for instance with vm_state deleted and task_state None.#033[00m
Oct  8 19:15:17 compute-0 nova_compute[117514]: 2025-10-08 19:15:17.220 2 DEBUG nova.compute.manager [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Received event network-vif-plugged-2139e839-c698-494f-9fbc-5605baef1d1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:15:17 compute-0 nova_compute[117514]: 2025-10-08 19:15:17.221 2 DEBUG oslo_concurrency.lockutils [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:15:17 compute-0 nova_compute[117514]: 2025-10-08 19:15:17.221 2 DEBUG oslo_concurrency.lockutils [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:15:17 compute-0 nova_compute[117514]: 2025-10-08 19:15:17.222 2 DEBUG oslo_concurrency.lockutils [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "2cd8a1e0-1eff-4f72-b839-340a50f3f21c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:15:17 compute-0 nova_compute[117514]: 2025-10-08 19:15:17.223 2 DEBUG nova.compute.manager [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] No waiting events found dispatching network-vif-plugged-2139e839-c698-494f-9fbc-5605baef1d1d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:15:17 compute-0 nova_compute[117514]: 2025-10-08 19:15:17.223 2 WARNING nova.compute.manager [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Received unexpected event network-vif-plugged-2139e839-c698-494f-9fbc-5605baef1d1d for instance with vm_state deleted and task_state None.#033[00m
Oct  8 19:15:17 compute-0 nova_compute[117514]: 2025-10-08 19:15:17.224 2 DEBUG nova.compute.manager [req-1b959095-0c45-4df7-89e0-099cf01f73ef req-98c395f5-0578-4c38-877c-3f2740c177b1 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Received event network-vif-deleted-2139e839-c698-494f-9fbc-5605baef1d1d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:15:17 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:17.434 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f81f7a-64d8-418a-a74c-b879bd6deb83, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:15:17 compute-0 nova_compute[117514]: 2025-10-08 19:15:17.715 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:15:18 compute-0 nova_compute[117514]: 2025-10-08 19:15:18.761 2 DEBUG oslo_concurrency.lockutils [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:15:18 compute-0 nova_compute[117514]: 2025-10-08 19:15:18.762 2 DEBUG oslo_concurrency.lockutils [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:15:18 compute-0 nova_compute[117514]: 2025-10-08 19:15:18.763 2 DEBUG oslo_concurrency.lockutils [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:15:18 compute-0 nova_compute[117514]: 2025-10-08 19:15:18.763 2 DEBUG oslo_concurrency.lockutils [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:15:18 compute-0 nova_compute[117514]: 2025-10-08 19:15:18.764 2 DEBUG oslo_concurrency.lockutils [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:15:18 compute-0 nova_compute[117514]: 2025-10-08 19:15:18.766 2 INFO nova.compute.manager [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Terminating instance#033[00m
Oct  8 19:15:18 compute-0 nova_compute[117514]: 2025-10-08 19:15:18.768 2 DEBUG nova.compute.manager [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 19:15:18 compute-0 kernel: tapae9e7968-10 (unregistering): left promiscuous mode
Oct  8 19:15:18 compute-0 NetworkManager[1035]: <info>  [1759950918.7959] device (tapae9e7968-10): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 19:15:18 compute-0 ovn_controller[19759]: 2025-10-08T19:15:18Z|00151|binding|INFO|Releasing lport ae9e7968-10b0-4606-9fa3-c91374cf1cc1 from this chassis (sb_readonly=0)
Oct  8 19:15:18 compute-0 ovn_controller[19759]: 2025-10-08T19:15:18Z|00152|binding|INFO|Setting lport ae9e7968-10b0-4606-9fa3-c91374cf1cc1 down in Southbound
Oct  8 19:15:18 compute-0 ovn_controller[19759]: 2025-10-08T19:15:18Z|00153|binding|INFO|Removing iface tapae9e7968-10 ovn-installed in OVS
Oct  8 19:15:18 compute-0 nova_compute[117514]: 2025-10-08 19:15:18.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:18 compute-0 nova_compute[117514]: 2025-10-08 19:15:18.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:18 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:18.819 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:23:50:87 10.100.0.4'], port_security=['fa:16:3e:23:50:87 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '5e004931-f1db-408c-9f7a-6c6c50c5f8ef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'c3b607ea-9253-4328-bb00-668338c7a25d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=770536b4-68ae-4751-9b56-96d89b6bc561, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=ae9e7968-10b0-4606-9fa3-c91374cf1cc1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:15:18 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:18.822 28643 INFO neutron.agent.ovn.metadata.agent [-] Port ae9e7968-10b0-4606-9fa3-c91374cf1cc1 in datapath 6826b0cb-7eaf-4468-bf17-e3c581bfc4ac unbound from our chassis#033[00m
Oct  8 19:15:18 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:18.825 28643 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6826b0cb-7eaf-4468-bf17-e3c581bfc4ac, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 19:15:18 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:18.830 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[a1351724-bc88-4969-a0d8-e347bb32ae15]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:15:18 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:18.831 28643 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac namespace which is not needed anymore#033[00m
Oct  8 19:15:18 compute-0 nova_compute[117514]: 2025-10-08 19:15:18.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:18 compute-0 nova_compute[117514]: 2025-10-08 19:15:18.879 2 DEBUG nova.compute.manager [req-00b34679-df5a-49ec-91f6-3b0f8940b1b5 req-1ed3953d-d099-40df-bfe7-1c8818fcaccb bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received event network-changed-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:15:18 compute-0 nova_compute[117514]: 2025-10-08 19:15:18.881 2 DEBUG nova.compute.manager [req-00b34679-df5a-49ec-91f6-3b0f8940b1b5 req-1ed3953d-d099-40df-bfe7-1c8818fcaccb bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Refreshing instance network info cache due to event network-changed-ae9e7968-10b0-4606-9fa3-c91374cf1cc1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 19:15:18 compute-0 nova_compute[117514]: 2025-10-08 19:15:18.881 2 DEBUG oslo_concurrency.lockutils [req-00b34679-df5a-49ec-91f6-3b0f8940b1b5 req-1ed3953d-d099-40df-bfe7-1c8818fcaccb bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:15:18 compute-0 nova_compute[117514]: 2025-10-08 19:15:18.881 2 DEBUG oslo_concurrency.lockutils [req-00b34679-df5a-49ec-91f6-3b0f8940b1b5 req-1ed3953d-d099-40df-bfe7-1c8818fcaccb bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:15:18 compute-0 nova_compute[117514]: 2025-10-08 19:15:18.881 2 DEBUG nova.network.neutron [req-00b34679-df5a-49ec-91f6-3b0f8940b1b5 req-1ed3953d-d099-40df-bfe7-1c8818fcaccb bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Refreshing network info cache for port ae9e7968-10b0-4606-9fa3-c91374cf1cc1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 19:15:18 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Oct  8 19:15:18 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000b.scope: Consumed 14.004s CPU time.
Oct  8 19:15:18 compute-0 systemd-machined[77568]: Machine qemu-11-instance-0000000b terminated.
Oct  8 19:15:18 compute-0 neutron-haproxy-ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac[150173]: [NOTICE]   (150177) : haproxy version is 2.8.14-c23fe91
Oct  8 19:15:18 compute-0 neutron-haproxy-ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac[150173]: [NOTICE]   (150177) : path to executable is /usr/sbin/haproxy
Oct  8 19:15:18 compute-0 neutron-haproxy-ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac[150173]: [WARNING]  (150177) : Exiting Master process...
Oct  8 19:15:18 compute-0 neutron-haproxy-ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac[150173]: [ALERT]    (150177) : Current worker (150179) exited with code 143 (Terminated)
Oct  8 19:15:18 compute-0 neutron-haproxy-ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac[150173]: [WARNING]  (150177) : All workers exited. Exiting... (0)
Oct  8 19:15:18 compute-0 systemd[1]: libpod-fd1984bbb04b2d37335257684b95c287df155b6e2e004efa2c044877a08a6b2b.scope: Deactivated successfully.
Oct  8 19:15:18 compute-0 podman[150637]: 2025-10-08 19:15:18.979531394 +0000 UTC m=+0.046479821 container died fd1984bbb04b2d37335257684b95c287df155b6e2e004efa2c044877a08a6b2b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team)
Oct  8 19:15:18 compute-0 nova_compute[117514]: 2025-10-08 19:15:18.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:18 compute-0 nova_compute[117514]: 2025-10-08 19:15:18.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:19 compute-0 systemd[1]: var-lib-containers-storage-overlay-d6837e9d7ec807fa0b782d57371b150b504d3cac3a36af379f52c92234713c15-merged.mount: Deactivated successfully.
Oct  8 19:15:19 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fd1984bbb04b2d37335257684b95c287df155b6e2e004efa2c044877a08a6b2b-userdata-shm.mount: Deactivated successfully.
Oct  8 19:15:19 compute-0 podman[150637]: 2025-10-08 19:15:19.038778192 +0000 UTC m=+0.105726589 container cleanup fd1984bbb04b2d37335257684b95c287df155b6e2e004efa2c044877a08a6b2b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  8 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.038 2 INFO nova.virt.libvirt.driver [-] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Instance destroyed successfully.#033[00m
Oct  8 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.039 2 DEBUG nova.objects.instance [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'resources' on Instance uuid 5e004931-f1db-408c-9f7a-6c6c50c5f8ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.052 2 DEBUG nova.virt.libvirt.vif [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:14:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-103133275',display_name='tempest-TestNetworkBasicOps-server-103133275',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-103133275',id=11,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI5y6d80fHySET4pCbLeqyj0cyDTZn6hTOGziG7pCiD92qFDw7Uq+y0suIKpGvDK2QOm6VBv2vJI5Io6WjjxteICCSlzmOgxu+CdOrYx2YA1B+bI4ndO5c+cp00qcb4ncw==',key_name='tempest-TestNetworkBasicOps-286586540',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:14:30Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-cyi34c6v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:14:30Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=5e004931-f1db-408c-9f7a-6c6c50c5f8ef,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "address": "fa:16:3e:23:50:87", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae9e7968-10", "ovs_interfaceid": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 19:15:19 compute-0 systemd[1]: libpod-conmon-fd1984bbb04b2d37335257684b95c287df155b6e2e004efa2c044877a08a6b2b.scope: Deactivated successfully.
Oct  8 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.053 2 DEBUG nova.network.os_vif_util [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "address": "fa:16:3e:23:50:87", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae9e7968-10", "ovs_interfaceid": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.054 2 DEBUG nova.network.os_vif_util [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:23:50:87,bridge_name='br-int',has_traffic_filtering=True,id=ae9e7968-10b0-4606-9fa3-c91374cf1cc1,network=Network(6826b0cb-7eaf-4468-bf17-e3c581bfc4ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae9e7968-10') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.054 2 DEBUG os_vif [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:23:50:87,bridge_name='br-int',has_traffic_filtering=True,id=ae9e7968-10b0-4606-9fa3-c91374cf1cc1,network=Network(6826b0cb-7eaf-4468-bf17-e3c581bfc4ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae9e7968-10') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.057 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapae9e7968-10, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.064 2 INFO os_vif [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:23:50:87,bridge_name='br-int',has_traffic_filtering=True,id=ae9e7968-10b0-4606-9fa3-c91374cf1cc1,network=Network(6826b0cb-7eaf-4468-bf17-e3c581bfc4ac),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapae9e7968-10')#033[00m
Oct  8 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.065 2 INFO nova.virt.libvirt.driver [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Deleting instance files /var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef_del#033[00m
Oct  8 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.066 2 INFO nova.virt.libvirt.driver [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Deletion of /var/lib/nova/instances/5e004931-f1db-408c-9f7a-6c6c50c5f8ef_del complete#033[00m
Oct  8 19:15:19 compute-0 podman[150684]: 2025-10-08 19:15:19.113007442 +0000 UTC m=+0.048313804 container remove fd1984bbb04b2d37335257684b95c287df155b6e2e004efa2c044877a08a6b2b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.113 2 INFO nova.compute.manager [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.113 2 DEBUG oslo.service.loopingcall [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.114 2 DEBUG nova.compute.manager [-] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.114 2 DEBUG nova.network.neutron [-] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 19:15:19 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:19.120 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[4c9f01e3-0cfe-42ab-b1c3-ded22fc45547]: (4, ('Wed Oct  8 07:15:18 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac (fd1984bbb04b2d37335257684b95c287df155b6e2e004efa2c044877a08a6b2b)\nfd1984bbb04b2d37335257684b95c287df155b6e2e004efa2c044877a08a6b2b\nWed Oct  8 07:15:19 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac (fd1984bbb04b2d37335257684b95c287df155b6e2e004efa2c044877a08a6b2b)\nfd1984bbb04b2d37335257684b95c287df155b6e2e004efa2c044877a08a6b2b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:15:19 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:19.122 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[aac412aa-0e03-47af-a077-d0a1cb6f7e52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:15:19 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:19.123 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6826b0cb-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:19 compute-0 kernel: tap6826b0cb-70: left promiscuous mode
Oct  8 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:19 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:19.154 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[385b4dd4-0246-4474-b917-863e4a20df72]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:15:19 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:19.187 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[77f3bc3d-3bc1-49ef-83f7-6e7649b84a24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:15:19 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:19.189 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[92fa3b3d-ed4d-4901-8643-21aa951f7078]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:15:19 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:19.209 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[30261ff5-3392-4bf2-8dec-3a579754bdf3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 154017, 'reachable_time': 32844, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 150700, 'error': None, 'target': 'ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:15:19 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:19.213 28783 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6826b0cb-7eaf-4468-bf17-e3c581bfc4ac deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 19:15:19 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:19.213 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[d11a7bf6-9037-4b7b-a17d-9b3fd5598bbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:15:19 compute-0 systemd[1]: run-netns-ovnmeta\x2d6826b0cb\x2d7eaf\x2d4468\x2dbf17\x2de3c581bfc4ac.mount: Deactivated successfully.
Oct  8 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.262 2 DEBUG nova.compute.manager [req-4db90871-fc73-4108-8839-ed9842217211 req-168b42c7-1d0d-42e0-af0f-0a9ac648938b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received event network-vif-unplugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.263 2 DEBUG oslo_concurrency.lockutils [req-4db90871-fc73-4108-8839-ed9842217211 req-168b42c7-1d0d-42e0-af0f-0a9ac648938b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.263 2 DEBUG oslo_concurrency.lockutils [req-4db90871-fc73-4108-8839-ed9842217211 req-168b42c7-1d0d-42e0-af0f-0a9ac648938b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.263 2 DEBUG oslo_concurrency.lockutils [req-4db90871-fc73-4108-8839-ed9842217211 req-168b42c7-1d0d-42e0-af0f-0a9ac648938b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.263 2 DEBUG nova.compute.manager [req-4db90871-fc73-4108-8839-ed9842217211 req-168b42c7-1d0d-42e0-af0f-0a9ac648938b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] No waiting events found dispatching network-vif-unplugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.264 2 DEBUG nova.compute.manager [req-4db90871-fc73-4108-8839-ed9842217211 req-168b42c7-1d0d-42e0-af0f-0a9ac648938b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received event network-vif-unplugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  8 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.801 2 DEBUG nova.network.neutron [-] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.816 2 INFO nova.compute.manager [-] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Took 0.70 seconds to deallocate network for instance.#033[00m
Oct  8 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.865 2 DEBUG oslo_concurrency.lockutils [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.865 2 DEBUG oslo_concurrency.lockutils [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.913 2 DEBUG nova.compute.provider_tree [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.928 2 DEBUG nova.scheduler.client.report [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.951 2 DEBUG oslo_concurrency.lockutils [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.085s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:15:19 compute-0 nova_compute[117514]: 2025-10-08 19:15:19.973 2 INFO nova.scheduler.client.report [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Deleted allocations for instance 5e004931-f1db-408c-9f7a-6c6c50c5f8ef#033[00m
Oct  8 19:15:20 compute-0 nova_compute[117514]: 2025-10-08 19:15:20.037 2 DEBUG oslo_concurrency.lockutils [None req-f23efa21-17b0-445c-9927-af056d4d06b7 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.275s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:15:20 compute-0 nova_compute[117514]: 2025-10-08 19:15:20.233 2 DEBUG nova.network.neutron [req-00b34679-df5a-49ec-91f6-3b0f8940b1b5 req-1ed3953d-d099-40df-bfe7-1c8818fcaccb bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Updated VIF entry in instance network info cache for port ae9e7968-10b0-4606-9fa3-c91374cf1cc1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 19:15:20 compute-0 nova_compute[117514]: 2025-10-08 19:15:20.233 2 DEBUG nova.network.neutron [req-00b34679-df5a-49ec-91f6-3b0f8940b1b5 req-1ed3953d-d099-40df-bfe7-1c8818fcaccb bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Updating instance_info_cache with network_info: [{"id": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "address": "fa:16:3e:23:50:87", "network": {"id": "6826b0cb-7eaf-4468-bf17-e3c581bfc4ac", "bridge": "br-int", "label": "tempest-network-smoke--1582861562", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapae9e7968-10", "ovs_interfaceid": "ae9e7968-10b0-4606-9fa3-c91374cf1cc1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:15:20 compute-0 nova_compute[117514]: 2025-10-08 19:15:20.254 2 DEBUG oslo_concurrency.lockutils [req-00b34679-df5a-49ec-91f6-3b0f8940b1b5 req-1ed3953d-d099-40df-bfe7-1c8818fcaccb bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-5e004931-f1db-408c-9f7a-6c6c50c5f8ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:15:21 compute-0 nova_compute[117514]: 2025-10-08 19:15:21.354 2 DEBUG nova.compute.manager [req-48566298-102e-41f1-84cb-ba28998cbea0 req-8c223513-6006-41fc-b52a-f18526bd888c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received event network-vif-plugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:15:21 compute-0 nova_compute[117514]: 2025-10-08 19:15:21.355 2 DEBUG oslo_concurrency.lockutils [req-48566298-102e-41f1-84cb-ba28998cbea0 req-8c223513-6006-41fc-b52a-f18526bd888c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:15:21 compute-0 nova_compute[117514]: 2025-10-08 19:15:21.355 2 DEBUG oslo_concurrency.lockutils [req-48566298-102e-41f1-84cb-ba28998cbea0 req-8c223513-6006-41fc-b52a-f18526bd888c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:15:21 compute-0 nova_compute[117514]: 2025-10-08 19:15:21.356 2 DEBUG oslo_concurrency.lockutils [req-48566298-102e-41f1-84cb-ba28998cbea0 req-8c223513-6006-41fc-b52a-f18526bd888c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "5e004931-f1db-408c-9f7a-6c6c50c5f8ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:15:21 compute-0 nova_compute[117514]: 2025-10-08 19:15:21.356 2 DEBUG nova.compute.manager [req-48566298-102e-41f1-84cb-ba28998cbea0 req-8c223513-6006-41fc-b52a-f18526bd888c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] No waiting events found dispatching network-vif-plugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:15:21 compute-0 nova_compute[117514]: 2025-10-08 19:15:21.357 2 WARNING nova.compute.manager [req-48566298-102e-41f1-84cb-ba28998cbea0 req-8c223513-6006-41fc-b52a-f18526bd888c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received unexpected event network-vif-plugged-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 for instance with vm_state deleted and task_state None.#033[00m
Oct  8 19:15:21 compute-0 nova_compute[117514]: 2025-10-08 19:15:21.357 2 DEBUG nova.compute.manager [req-48566298-102e-41f1-84cb-ba28998cbea0 req-8c223513-6006-41fc-b52a-f18526bd888c bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Received event network-vif-deleted-ae9e7968-10b0-4606-9fa3-c91374cf1cc1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:15:21 compute-0 nova_compute[117514]: 2025-10-08 19:15:21.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:21 compute-0 podman[150701]: 2025-10-08 19:15:21.689543543 +0000 UTC m=+0.095794074 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 19:15:22 compute-0 nova_compute[117514]: 2025-10-08 19:15:22.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:22 compute-0 nova_compute[117514]: 2025-10-08 19:15:22.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:24 compute-0 nova_compute[117514]: 2025-10-08 19:15:24.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:26 compute-0 nova_compute[117514]: 2025-10-08 19:15:26.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:29 compute-0 nova_compute[117514]: 2025-10-08 19:15:29.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:30 compute-0 nova_compute[117514]: 2025-10-08 19:15:30.967 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759950915.9657428, 2cd8a1e0-1eff-4f72-b839-340a50f3f21c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:15:30 compute-0 nova_compute[117514]: 2025-10-08 19:15:30.967 2 INFO nova.compute.manager [-] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] VM Stopped (Lifecycle Event)#033[00m
Oct  8 19:15:30 compute-0 nova_compute[117514]: 2025-10-08 19:15:30.993 2 DEBUG nova.compute.manager [None req-732071df-72e8-413e-b3f7-43bc5c6d67cd - - - - - -] [instance: 2cd8a1e0-1eff-4f72-b839-340a50f3f21c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:15:31 compute-0 nova_compute[117514]: 2025-10-08 19:15:31.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:31 compute-0 podman[150726]: 2025-10-08 19:15:31.713788796 +0000 UTC m=+0.120823875 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, 
container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Oct  8 19:15:34 compute-0 nova_compute[117514]: 2025-10-08 19:15:34.035 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759950919.0330064, 5e004931-f1db-408c-9f7a-6c6c50c5f8ef => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:15:34 compute-0 nova_compute[117514]: 2025-10-08 19:15:34.036 2 INFO nova.compute.manager [-] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] VM Stopped (Lifecycle Event)#033[00m
Oct  8 19:15:34 compute-0 nova_compute[117514]: 2025-10-08 19:15:34.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:34 compute-0 nova_compute[117514]: 2025-10-08 19:15:34.065 2 DEBUG nova.compute.manager [None req-31b0b051-8b49-4c51-b73f-02e5a245b715 - - - - - -] [instance: 5e004931-f1db-408c-9f7a-6c6c50c5f8ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:15:36 compute-0 nova_compute[117514]: 2025-10-08 19:15:36.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:37 compute-0 podman[150747]: 2025-10-08 19:15:37.684224384 +0000 UTC m=+0.084069545 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, managed_by=edpm_ansible)
Oct  8 19:15:37 compute-0 podman[150746]: 2025-10-08 19:15:37.684243324 +0000 UTC m=+0.098478900 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7)
Oct  8 19:15:37 compute-0 podman[150748]: 2025-10-08 19:15:37.698466754 +0000 UTC m=+0.101371314 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 19:15:39 compute-0 nova_compute[117514]: 2025-10-08 19:15:39.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:40 compute-0 podman[150809]: 2025-10-08 19:15:40.648848894 +0000 UTC m=+0.053626898 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  8 19:15:40 compute-0 podman[150807]: 2025-10-08 19:15:40.693590274 +0000 UTC m=+0.098239144 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  8 19:15:40 compute-0 podman[150808]: 2025-10-08 19:15:40.767960179 +0000 UTC m=+0.165142094 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  8 19:15:41 compute-0 nova_compute[117514]: 2025-10-08 19:15:41.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:41 compute-0 nova_compute[117514]: 2025-10-08 19:15:41.538 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "34ca788d-2398-4a40-9f96-040c0849b18f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:15:41 compute-0 nova_compute[117514]: 2025-10-08 19:15:41.539 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "34ca788d-2398-4a40-9f96-040c0849b18f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:15:41 compute-0 nova_compute[117514]: 2025-10-08 19:15:41.558 2 DEBUG nova.compute.manager [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  8 19:15:41 compute-0 nova_compute[117514]: 2025-10-08 19:15:41.639 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:15:41 compute-0 nova_compute[117514]: 2025-10-08 19:15:41.640 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:15:41 compute-0 nova_compute[117514]: 2025-10-08 19:15:41.651 2 DEBUG nova.virt.hardware [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  8 19:15:41 compute-0 nova_compute[117514]: 2025-10-08 19:15:41.651 2 INFO nova.compute.claims [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Claim successful on node compute-0.ctlplane.example.com#033[00m
Oct  8 19:15:41 compute-0 nova_compute[117514]: 2025-10-08 19:15:41.788 2 DEBUG nova.compute.provider_tree [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:15:41 compute-0 nova_compute[117514]: 2025-10-08 19:15:41.808 2 DEBUG nova.scheduler.client.report [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:15:41 compute-0 nova_compute[117514]: 2025-10-08 19:15:41.832 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:15:41 compute-0 nova_compute[117514]: 2025-10-08 19:15:41.833 2 DEBUG nova.compute.manager [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  8 19:15:41 compute-0 nova_compute[117514]: 2025-10-08 19:15:41.887 2 DEBUG nova.compute.manager [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  8 19:15:41 compute-0 nova_compute[117514]: 2025-10-08 19:15:41.888 2 DEBUG nova.network.neutron [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  8 19:15:41 compute-0 nova_compute[117514]: 2025-10-08 19:15:41.907 2 INFO nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  8 19:15:41 compute-0 nova_compute[117514]: 2025-10-08 19:15:41.928 2 DEBUG nova.compute.manager [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  8 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.033 2 DEBUG nova.compute.manager [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  8 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.035 2 DEBUG nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  8 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.036 2 INFO nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Creating image(s)#033[00m
Oct  8 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.037 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "/var/lib/nova/instances/34ca788d-2398-4a40-9f96-040c0849b18f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.038 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/34ca788d-2398-4a40-9f96-040c0849b18f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.039 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "/var/lib/nova/instances/34ca788d-2398-4a40-9f96-040c0849b18f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.061 2 DEBUG oslo_concurrency.processutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.149 2 DEBUG oslo_concurrency.processutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.150 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "008eb3078b811ee47058b7252a820910c35fc6df" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.151 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.173 2 DEBUG oslo_concurrency.processutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.230 2 DEBUG nova.policy [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  8 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.253 2 DEBUG oslo_concurrency.processutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.254 2 DEBUG oslo_concurrency.processutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/34ca788d-2398-4a40-9f96-040c0849b18f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.296 2 DEBUG oslo_concurrency.processutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df,backing_fmt=raw /var/lib/nova/instances/34ca788d-2398-4a40-9f96-040c0849b18f/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.297 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "008eb3078b811ee47058b7252a820910c35fc6df" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.298 2 DEBUG oslo_concurrency.processutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.379 2 DEBUG oslo_concurrency.processutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/008eb3078b811ee47058b7252a820910c35fc6df --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.381 2 DEBUG nova.virt.disk.api [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Checking if we can resize image /var/lib/nova/instances/34ca788d-2398-4a40-9f96-040c0849b18f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  8 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.382 2 DEBUG oslo_concurrency.processutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/34ca788d-2398-4a40-9f96-040c0849b18f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.476 2 DEBUG oslo_concurrency.processutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/34ca788d-2398-4a40-9f96-040c0849b18f/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.478 2 DEBUG nova.virt.disk.api [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Cannot resize image /var/lib/nova/instances/34ca788d-2398-4a40-9f96-040c0849b18f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  8 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.479 2 DEBUG nova.objects.instance [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'migration_context' on Instance uuid 34ca788d-2398-4a40-9f96-040c0849b18f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.493 2 DEBUG nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  8 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.494 2 DEBUG nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Ensure instance console log exists: /var/lib/nova/instances/34ca788d-2398-4a40-9f96-040c0849b18f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  8 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.494 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.495 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:15:42 compute-0 nova_compute[117514]: 2025-10-08 19:15:42.495 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:15:43 compute-0 nova_compute[117514]: 2025-10-08 19:15:43.324 2 DEBUG nova.network.neutron [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Successfully created port: 06998e1e-8ce7-484d-b3e4-7d44699229c4 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.003 2 DEBUG nova.network.neutron [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Successfully updated port: 06998e1e-8ce7-484d-b3e4-7d44699229c4 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.028 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "refresh_cache-34ca788d-2398-4a40-9f96-040c0849b18f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.029 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquired lock "refresh_cache-34ca788d-2398-4a40-9f96-040c0849b18f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.029 2 DEBUG nova.network.neutron [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.066 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.100 2 DEBUG nova.compute.manager [req-7046266a-73fc-4410-b39a-7bc6485e7f66 req-1e78eae3-271a-4812-8323-eab30103ec5d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Received event network-changed-06998e1e-8ce7-484d-b3e4-7d44699229c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.101 2 DEBUG nova.compute.manager [req-7046266a-73fc-4410-b39a-7bc6485e7f66 req-1e78eae3-271a-4812-8323-eab30103ec5d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Refreshing instance network info cache due to event network-changed-06998e1e-8ce7-484d-b3e4-7d44699229c4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.102 2 DEBUG oslo_concurrency.lockutils [req-7046266a-73fc-4410-b39a-7bc6485e7f66 req-1e78eae3-271a-4812-8323-eab30103ec5d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-34ca788d-2398-4a40-9f96-040c0849b18f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.198 2 DEBUG nova.network.neutron [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  8 19:15:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:44.235 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:15:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:44.236 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:15:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:44.236 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.882 2 DEBUG nova.network.neutron [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Updating instance_info_cache with network_info: [{"id": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "address": "fa:16:3e:66:c0:df", "network": {"id": "ed492f30-88ab-4074-a37b-2efd9113a46f", "bridge": "br-int", "label": "tempest-network-smoke--1871519036", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06998e1e-8c", "ovs_interfaceid": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.906 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Releasing lock "refresh_cache-34ca788d-2398-4a40-9f96-040c0849b18f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.906 2 DEBUG nova.compute.manager [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Instance network_info: |[{"id": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "address": "fa:16:3e:66:c0:df", "network": {"id": "ed492f30-88ab-4074-a37b-2efd9113a46f", "bridge": "br-int", "label": "tempest-network-smoke--1871519036", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06998e1e-8c", "ovs_interfaceid": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.907 2 DEBUG oslo_concurrency.lockutils [req-7046266a-73fc-4410-b39a-7bc6485e7f66 req-1e78eae3-271a-4812-8323-eab30103ec5d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-34ca788d-2398-4a40-9f96-040c0849b18f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.907 2 DEBUG nova.network.neutron [req-7046266a-73fc-4410-b39a-7bc6485e7f66 req-1e78eae3-271a-4812-8323-eab30103ec5d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Refreshing network info cache for port 06998e1e-8ce7-484d-b3e4-7d44699229c4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.912 2 DEBUG nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Start _get_guest_xml network_info=[{"id": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "address": "fa:16:3e:66:c0:df", "network": {"id": "ed492f30-88ab-4074-a37b-2efd9113a46f", "bridge": "br-int", "label": "tempest-network-smoke--1871519036", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06998e1e-8c", "ovs_interfaceid": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_secret_uuid': None, 'size': 0, 'boot_index': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_options': None, 'image_id': '23cfa426-7011-4566-992d-1c7af39f70dd'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.918 2 WARNING nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.924 2 DEBUG nova.virt.libvirt.host [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.925 2 DEBUG nova.virt.libvirt.host [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.936 2 DEBUG nova.virt.libvirt.host [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.937 2 DEBUG nova.virt.libvirt.host [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.938 2 DEBUG nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.939 2 DEBUG nova.virt.hardware [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-08T19:05:09Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='e8a148fc-4419-4813-98ff-a17e2a95609e',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-08T19:05:11Z,direct_url=<?>,disk_format='qcow2',id=23cfa426-7011-4566-992d-1c7af39f70dd,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0776a2a010754884a7b224f3b08ef53b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-08T19:05:12Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.940 2 DEBUG nova.virt.hardware [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.941 2 DEBUG nova.virt.hardware [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.941 2 DEBUG nova.virt.hardware [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.942 2 DEBUG nova.virt.hardware [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.942 2 DEBUG nova.virt.hardware [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.943 2 DEBUG nova.virt.hardware [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.943 2 DEBUG nova.virt.hardware [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.944 2 DEBUG nova.virt.hardware [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.944 2 DEBUG nova.virt.hardware [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.945 2 DEBUG nova.virt.hardware [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.951 2 DEBUG nova.virt.libvirt.vif [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:15:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-91776509',display_name='tempest-TestNetworkBasicOps-server-91776509',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-91776509',id=13,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI8PTrGv1QybFIubtsg8lczGea0IvQL8pvhihemAZSj0UMnf1scRH00KmJvAMVhcwpSfJBSsSB9h8z57cU6NeYho/jEOEiMidDlTZU4qxsLiPufykBInXUSkP3hGqOiJaw==',key_name='tempest-TestNetworkBasicOps-916834063',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-q8bisj0t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:15:41Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=34ca788d-2398-4a40-9f96-040c0849b18f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "address": "fa:16:3e:66:c0:df", "network": {"id": "ed492f30-88ab-4074-a37b-2efd9113a46f", "bridge": "br-int", "label": "tempest-network-smoke--1871519036", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06998e1e-8c", "ovs_interfaceid": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.951 2 DEBUG nova.network.os_vif_util [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "address": "fa:16:3e:66:c0:df", "network": {"id": "ed492f30-88ab-4074-a37b-2efd9113a46f", "bridge": "br-int", "label": "tempest-network-smoke--1871519036", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06998e1e-8c", "ovs_interfaceid": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.953 2 DEBUG nova.network.os_vif_util [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:66:c0:df,bridge_name='br-int',has_traffic_filtering=True,id=06998e1e-8ce7-484d-b3e4-7d44699229c4,network=Network(ed492f30-88ab-4074-a37b-2efd9113a46f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06998e1e-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.955 2 DEBUG nova.objects.instance [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 34ca788d-2398-4a40-9f96-040c0849b18f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.969 2 DEBUG nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] End _get_guest_xml xml=<domain type="kvm">
Oct  8 19:15:44 compute-0 nova_compute[117514]:  <uuid>34ca788d-2398-4a40-9f96-040c0849b18f</uuid>
Oct  8 19:15:44 compute-0 nova_compute[117514]:  <name>instance-0000000d</name>
Oct  8 19:15:44 compute-0 nova_compute[117514]:  <memory>131072</memory>
Oct  8 19:15:44 compute-0 nova_compute[117514]:  <vcpu>1</vcpu>
Oct  8 19:15:44 compute-0 nova_compute[117514]:  <metadata>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  8 19:15:44 compute-0 nova_compute[117514]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:      <nova:name>tempest-TestNetworkBasicOps-server-91776509</nova:name>
Oct  8 19:15:44 compute-0 nova_compute[117514]:      <nova:creationTime>2025-10-08 19:15:44</nova:creationTime>
Oct  8 19:15:44 compute-0 nova_compute[117514]:      <nova:flavor name="m1.nano">
Oct  8 19:15:44 compute-0 nova_compute[117514]:        <nova:memory>128</nova:memory>
Oct  8 19:15:44 compute-0 nova_compute[117514]:        <nova:disk>1</nova:disk>
Oct  8 19:15:44 compute-0 nova_compute[117514]:        <nova:swap>0</nova:swap>
Oct  8 19:15:44 compute-0 nova_compute[117514]:        <nova:ephemeral>0</nova:ephemeral>
Oct  8 19:15:44 compute-0 nova_compute[117514]:        <nova:vcpus>1</nova:vcpus>
Oct  8 19:15:44 compute-0 nova_compute[117514]:      </nova:flavor>
Oct  8 19:15:44 compute-0 nova_compute[117514]:      <nova:owner>
Oct  8 19:15:44 compute-0 nova_compute[117514]:        <nova:user uuid="efdb1424acdb478684cdb088b373ba05">tempest-TestNetworkBasicOps-1122149477-project-member</nova:user>
Oct  8 19:15:44 compute-0 nova_compute[117514]:        <nova:project uuid="b7f7c752a9c5498f8eda73e461895ac9">tempest-TestNetworkBasicOps-1122149477</nova:project>
Oct  8 19:15:44 compute-0 nova_compute[117514]:      </nova:owner>
Oct  8 19:15:44 compute-0 nova_compute[117514]:      <nova:root type="image" uuid="23cfa426-7011-4566-992d-1c7af39f70dd"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:      <nova:ports>
Oct  8 19:15:44 compute-0 nova_compute[117514]:        <nova:port uuid="06998e1e-8ce7-484d-b3e4-7d44699229c4">
Oct  8 19:15:44 compute-0 nova_compute[117514]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:        </nova:port>
Oct  8 19:15:44 compute-0 nova_compute[117514]:      </nova:ports>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    </nova:instance>
Oct  8 19:15:44 compute-0 nova_compute[117514]:  </metadata>
Oct  8 19:15:44 compute-0 nova_compute[117514]:  <sysinfo type="smbios">
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <system>
Oct  8 19:15:44 compute-0 nova_compute[117514]:      <entry name="manufacturer">RDO</entry>
Oct  8 19:15:44 compute-0 nova_compute[117514]:      <entry name="product">OpenStack Compute</entry>
Oct  8 19:15:44 compute-0 nova_compute[117514]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  8 19:15:44 compute-0 nova_compute[117514]:      <entry name="serial">34ca788d-2398-4a40-9f96-040c0849b18f</entry>
Oct  8 19:15:44 compute-0 nova_compute[117514]:      <entry name="uuid">34ca788d-2398-4a40-9f96-040c0849b18f</entry>
Oct  8 19:15:44 compute-0 nova_compute[117514]:      <entry name="family">Virtual Machine</entry>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    </system>
Oct  8 19:15:44 compute-0 nova_compute[117514]:  </sysinfo>
Oct  8 19:15:44 compute-0 nova_compute[117514]:  <os>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <boot dev="hd"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <smbios mode="sysinfo"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:  </os>
Oct  8 19:15:44 compute-0 nova_compute[117514]:  <features>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <acpi/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <apic/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <vmcoreinfo/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:  </features>
Oct  8 19:15:44 compute-0 nova_compute[117514]:  <clock offset="utc">
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <timer name="pit" tickpolicy="delay"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <timer name="hpet" present="no"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:  </clock>
Oct  8 19:15:44 compute-0 nova_compute[117514]:  <cpu mode="host-model" match="exact">
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <topology sockets="1" cores="1" threads="1"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:  </cpu>
Oct  8 19:15:44 compute-0 nova_compute[117514]:  <devices>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <disk type="file" device="disk">
Oct  8 19:15:44 compute-0 nova_compute[117514]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:      <source file="/var/lib/nova/instances/34ca788d-2398-4a40-9f96-040c0849b18f/disk"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:      <target dev="vda" bus="virtio"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <disk type="file" device="cdrom">
Oct  8 19:15:44 compute-0 nova_compute[117514]:      <driver name="qemu" type="raw" cache="none"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:      <source file="/var/lib/nova/instances/34ca788d-2398-4a40-9f96-040c0849b18f/disk.config"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:      <target dev="sda" bus="sata"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    </disk>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <interface type="ethernet">
Oct  8 19:15:44 compute-0 nova_compute[117514]:      <mac address="fa:16:3e:66:c0:df"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:      <model type="virtio"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:      <driver name="vhost" rx_queue_size="512"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:      <mtu size="1442"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:      <target dev="tap06998e1e-8c"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    </interface>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <serial type="pty">
Oct  8 19:15:44 compute-0 nova_compute[117514]:      <log file="/var/lib/nova/instances/34ca788d-2398-4a40-9f96-040c0849b18f/console.log" append="off"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    </serial>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <video>
Oct  8 19:15:44 compute-0 nova_compute[117514]:      <model type="virtio"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    </video>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <input type="tablet" bus="usb"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <rng model="virtio">
Oct  8 19:15:44 compute-0 nova_compute[117514]:      <backend model="random">/dev/urandom</backend>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    </rng>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <controller type="pci" model="pcie-root-port"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <controller type="usb" index="0"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    <memballoon model="virtio">
Oct  8 19:15:44 compute-0 nova_compute[117514]:      <stats period="10"/>
Oct  8 19:15:44 compute-0 nova_compute[117514]:    </memballoon>
Oct  8 19:15:44 compute-0 nova_compute[117514]:  </devices>
Oct  8 19:15:44 compute-0 nova_compute[117514]: </domain>
Oct  8 19:15:44 compute-0 nova_compute[117514]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.971 2 DEBUG nova.compute.manager [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Preparing to wait for external event network-vif-plugged-06998e1e-8ce7-484d-b3e4-7d44699229c4 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.972 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "34ca788d-2398-4a40-9f96-040c0849b18f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.972 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "34ca788d-2398-4a40-9f96-040c0849b18f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.972 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "34ca788d-2398-4a40-9f96-040c0849b18f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.973 2 DEBUG nova.virt.libvirt.vif [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-08T19:15:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-91776509',display_name='tempest-TestNetworkBasicOps-server-91776509',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-91776509',id=13,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI8PTrGv1QybFIubtsg8lczGea0IvQL8pvhihemAZSj0UMnf1scRH00KmJvAMVhcwpSfJBSsSB9h8z57cU6NeYho/jEOEiMidDlTZU4qxsLiPufykBInXUSkP3hGqOiJaw==',key_name='tempest-TestNetworkBasicOps-916834063',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-q8bisj0t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-08T19:15:41Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=34ca788d-2398-4a40-9f96-040c0849b18f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "address": "fa:16:3e:66:c0:df", "network": {"id": "ed492f30-88ab-4074-a37b-2efd9113a46f", "bridge": "br-int", "label": "tempest-network-smoke--1871519036", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06998e1e-8c", "ovs_interfaceid": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.973 2 DEBUG nova.network.os_vif_util [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "address": "fa:16:3e:66:c0:df", "network": {"id": "ed492f30-88ab-4074-a37b-2efd9113a46f", "bridge": "br-int", "label": "tempest-network-smoke--1871519036", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06998e1e-8c", "ovs_interfaceid": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.974 2 DEBUG nova.network.os_vif_util [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:66:c0:df,bridge_name='br-int',has_traffic_filtering=True,id=06998e1e-8ce7-484d-b3e4-7d44699229c4,network=Network(ed492f30-88ab-4074-a37b-2efd9113a46f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06998e1e-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.974 2 DEBUG os_vif [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:66:c0:df,bridge_name='br-int',has_traffic_filtering=True,id=06998e1e-8ce7-484d-b3e4-7d44699229c4,network=Network(ed492f30-88ab-4074-a37b-2efd9113a46f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06998e1e-8c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.975 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.976 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.982 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap06998e1e-8c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.983 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap06998e1e-8c, col_values=(('external_ids', {'iface-id': '06998e1e-8ce7-484d-b3e4-7d44699229c4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:66:c0:df', 'vm-uuid': '34ca788d-2398-4a40-9f96-040c0849b18f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:15:44 compute-0 NetworkManager[1035]: <info>  [1759950944.9873] manager: (tap06998e1e-8c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/89)
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:44 compute-0 nova_compute[117514]: 2025-10-08 19:15:44.995 2 INFO os_vif [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:66:c0:df,bridge_name='br-int',has_traffic_filtering=True,id=06998e1e-8ce7-484d-b3e4-7d44699229c4,network=Network(ed492f30-88ab-4074-a37b-2efd9113a46f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06998e1e-8c')#033[00m
Oct  8 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.052 2 DEBUG nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.053 2 DEBUG nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  8 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.053 2 DEBUG nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] No VIF found with MAC fa:16:3e:66:c0:df, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  8 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.054 2 INFO nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Using config drive#033[00m
Oct  8 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.376 2 INFO nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Creating config drive at /var/lib/nova/instances/34ca788d-2398-4a40-9f96-040c0849b18f/disk.config#033[00m
Oct  8 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.380 2 DEBUG oslo_concurrency.processutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/34ca788d-2398-4a40-9f96-040c0849b18f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5gl3_4fm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.509 2 DEBUG oslo_concurrency.processutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/34ca788d-2398-4a40-9f96-040c0849b18f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5gl3_4fm" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:15:45 compute-0 kernel: tap06998e1e-8c: entered promiscuous mode
Oct  8 19:15:45 compute-0 NetworkManager[1035]: <info>  [1759950945.5844] manager: (tap06998e1e-8c): new Tun device (/org/freedesktop/NetworkManager/Devices/90)
Oct  8 19:15:45 compute-0 ovn_controller[19759]: 2025-10-08T19:15:45Z|00154|binding|INFO|Claiming lport 06998e1e-8ce7-484d-b3e4-7d44699229c4 for this chassis.
Oct  8 19:15:45 compute-0 ovn_controller[19759]: 2025-10-08T19:15:45Z|00155|binding|INFO|06998e1e-8ce7-484d-b3e4-7d44699229c4: Claiming fa:16:3e:66:c0:df 10.100.0.6
Oct  8 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.596 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:66:c0:df 10.100.0.6'], port_security=['fa:16:3e:66:c0:df 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed492f30-88ab-4074-a37b-2efd9113a46f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0aa04153-3da7-40f5-b74d-f2ebacf56fd3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d714e95d-17df-46e0-aa89-985c7cbd12a3, chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=06998e1e-8ce7-484d-b3e4-7d44699229c4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.598 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 06998e1e-8ce7-484d-b3e4-7d44699229c4 in datapath ed492f30-88ab-4074-a37b-2efd9113a46f bound to our chassis#033[00m
Oct  8 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.598 28643 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ed492f30-88ab-4074-a37b-2efd9113a46f#033[00m
Oct  8 19:15:45 compute-0 systemd-udevd[150905]: Network interface NamePolicy= disabled on kernel command line.
Oct  8 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.614 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[15079f8c-5e21-4d61-93e8-0c60ca29032e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.615 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH taped492f30-81 in ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  8 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.617 144726 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface taped492f30-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  8 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.618 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[f978ecb8-8be6-4ae0-b63e-4fd0af4996c4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.619 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[e83eda28-f06c-4025-bbe3-c48ff2fb6bca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:15:45 compute-0 NetworkManager[1035]: <info>  [1759950945.6278] device (tap06998e1e-8c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  8 19:15:45 compute-0 NetworkManager[1035]: <info>  [1759950945.6287] device (tap06998e1e-8c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  8 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.631 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[199b563c-4f4d-4804-8657-13bf37b87572]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:15:45 compute-0 systemd-machined[77568]: New machine qemu-13-instance-0000000d.
Oct  8 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:45 compute-0 ovn_controller[19759]: 2025-10-08T19:15:45Z|00156|binding|INFO|Setting lport 06998e1e-8ce7-484d-b3e4-7d44699229c4 ovn-installed in OVS
Oct  8 19:15:45 compute-0 ovn_controller[19759]: 2025-10-08T19:15:45Z|00157|binding|INFO|Setting lport 06998e1e-8ce7-484d-b3e4-7d44699229c4 up in Southbound
Oct  8 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:45 compute-0 systemd[1]: Started Virtual Machine qemu-13-instance-0000000d.
Oct  8 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.661 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[9dac3c58-0591-4f86-be5b-603c37d49423]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.689 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[6387f2fe-ba50-43d1-9fb6-4781c043af30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.696 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[f2c43320-beb2-4d3f-b834-607cb293eece]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:15:45 compute-0 NetworkManager[1035]: <info>  [1759950945.6975] manager: (taped492f30-80): new Veth device (/org/freedesktop/NetworkManager/Devices/91)
Oct  8 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.738 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[8965ab01-6495-4048-a5a1-841389b5687d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.742 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[d6d9ab24-fd50-4c71-b6d1-9f6b8912041e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:15:45 compute-0 NetworkManager[1035]: <info>  [1759950945.7731] device (taped492f30-80): carrier: link connected
Oct  8 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.780 144740 DEBUG oslo.privsep.daemon [-] privsep: reply[fc96d55b-13ad-42f6-a435-773fd67ce923]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.802 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[ed5f33b7-0332-4ba6-a70f-1157cc5dc7c8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'taped492f30-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ad:6a:4b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 161635, 'reachable_time': 15437, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 150938, 'error': None, 'target': 'ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.826 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[761217d0-2e59-42e6-b9c4-92ca903dd05d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fead:6a4b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 161635, 'tstamp': 161635}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 150939, 'error': None, 'target': 'ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.846 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[a13a8510-7330-403b-8a36-8b465c9699e2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'taped492f30-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ad:6a:4b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 161635, 'reachable_time': 15437, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 150940, 'error': None, 'target': 'ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.865 2 DEBUG nova.compute.manager [req-2a1799c4-0291-4e7e-a2f1-31a7ce77551a req-09d5a077-4d9d-4cbb-beb6-cd3a59414b2d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Received event network-vif-plugged-06998e1e-8ce7-484d-b3e4-7d44699229c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.866 2 DEBUG oslo_concurrency.lockutils [req-2a1799c4-0291-4e7e-a2f1-31a7ce77551a req-09d5a077-4d9d-4cbb-beb6-cd3a59414b2d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "34ca788d-2398-4a40-9f96-040c0849b18f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.867 2 DEBUG oslo_concurrency.lockutils [req-2a1799c4-0291-4e7e-a2f1-31a7ce77551a req-09d5a077-4d9d-4cbb-beb6-cd3a59414b2d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "34ca788d-2398-4a40-9f96-040c0849b18f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.867 2 DEBUG oslo_concurrency.lockutils [req-2a1799c4-0291-4e7e-a2f1-31a7ce77551a req-09d5a077-4d9d-4cbb-beb6-cd3a59414b2d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "34ca788d-2398-4a40-9f96-040c0849b18f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.868 2 DEBUG nova.compute.manager [req-2a1799c4-0291-4e7e-a2f1-31a7ce77551a req-09d5a077-4d9d-4cbb-beb6-cd3a59414b2d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Processing event network-vif-plugged-06998e1e-8ce7-484d-b3e4-7d44699229c4 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  8 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.890 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[aa30b568-56de-4920-9b74-6b3f82f018d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.926 2 DEBUG nova.network.neutron [req-7046266a-73fc-4410-b39a-7bc6485e7f66 req-1e78eae3-271a-4812-8323-eab30103ec5d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Updated VIF entry in instance network info cache for port 06998e1e-8ce7-484d-b3e4-7d44699229c4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.927 2 DEBUG nova.network.neutron [req-7046266a-73fc-4410-b39a-7bc6485e7f66 req-1e78eae3-271a-4812-8323-eab30103ec5d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Updating instance_info_cache with network_info: [{"id": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "address": "fa:16:3e:66:c0:df", "network": {"id": "ed492f30-88ab-4074-a37b-2efd9113a46f", "bridge": "br-int", "label": "tempest-network-smoke--1871519036", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06998e1e-8c", "ovs_interfaceid": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.956 2 DEBUG oslo_concurrency.lockutils [req-7046266a-73fc-4410-b39a-7bc6485e7f66 req-1e78eae3-271a-4812-8323-eab30103ec5d bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-34ca788d-2398-4a40-9f96-040c0849b18f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.983 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[39dff74f-650d-4dda-a2e6-f61d55573330]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.985 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=taped492f30-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.986 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  8 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.987 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=taped492f30-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:15:45 compute-0 kernel: taped492f30-80: entered promiscuous mode
Oct  8 19:15:45 compute-0 NetworkManager[1035]: <info>  [1759950945.9904] manager: (taped492f30-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Oct  8 19:15:45 compute-0 nova_compute[117514]: 2025-10-08 19:15:45.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:45 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:45.993 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=taped492f30-80, col_values=(('external_ids', {'iface-id': '58bfd3a1-f863-472a-ae8b-afc52524c7cc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:15:45 compute-0 ovn_controller[19759]: 2025-10-08T19:15:45Z|00158|binding|INFO|Releasing lport 58bfd3a1-f863-472a-ae8b-afc52524c7cc from this chassis (sb_readonly=0)
Oct  8 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:46 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:46.022 28643 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ed492f30-88ab-4074-a37b-2efd9113a46f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ed492f30-88ab-4074-a37b-2efd9113a46f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  8 19:15:46 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:46.023 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[69485668-5361-4546-abad-5f9b309fc1c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:15:46 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:46.024 28643 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  8 19:15:46 compute-0 ovn_metadata_agent[28637]: global
Oct  8 19:15:46 compute-0 ovn_metadata_agent[28637]:    log         /dev/log local0 debug
Oct  8 19:15:46 compute-0 ovn_metadata_agent[28637]:    log-tag     haproxy-metadata-proxy-ed492f30-88ab-4074-a37b-2efd9113a46f
Oct  8 19:15:46 compute-0 ovn_metadata_agent[28637]:    user        root
Oct  8 19:15:46 compute-0 ovn_metadata_agent[28637]:    group       root
Oct  8 19:15:46 compute-0 ovn_metadata_agent[28637]:    maxconn     1024
Oct  8 19:15:46 compute-0 ovn_metadata_agent[28637]:    pidfile     /var/lib/neutron/external/pids/ed492f30-88ab-4074-a37b-2efd9113a46f.pid.haproxy
Oct  8 19:15:46 compute-0 ovn_metadata_agent[28637]:    daemon
Oct  8 19:15:46 compute-0 ovn_metadata_agent[28637]: 
Oct  8 19:15:46 compute-0 ovn_metadata_agent[28637]: defaults
Oct  8 19:15:46 compute-0 ovn_metadata_agent[28637]:    log global
Oct  8 19:15:46 compute-0 ovn_metadata_agent[28637]:    mode http
Oct  8 19:15:46 compute-0 ovn_metadata_agent[28637]:    option httplog
Oct  8 19:15:46 compute-0 ovn_metadata_agent[28637]:    option dontlognull
Oct  8 19:15:46 compute-0 ovn_metadata_agent[28637]:    option http-server-close
Oct  8 19:15:46 compute-0 ovn_metadata_agent[28637]:    option forwardfor
Oct  8 19:15:46 compute-0 ovn_metadata_agent[28637]:    retries                 3
Oct  8 19:15:46 compute-0 ovn_metadata_agent[28637]:    timeout http-request    30s
Oct  8 19:15:46 compute-0 ovn_metadata_agent[28637]:    timeout connect         30s
Oct  8 19:15:46 compute-0 ovn_metadata_agent[28637]:    timeout client          32s
Oct  8 19:15:46 compute-0 ovn_metadata_agent[28637]:    timeout server          32s
Oct  8 19:15:46 compute-0 ovn_metadata_agent[28637]:    timeout http-keep-alive 30s
Oct  8 19:15:46 compute-0 ovn_metadata_agent[28637]: 
Oct  8 19:15:46 compute-0 ovn_metadata_agent[28637]: 
Oct  8 19:15:46 compute-0 ovn_metadata_agent[28637]: listen listener
Oct  8 19:15:46 compute-0 ovn_metadata_agent[28637]:    bind 169.254.169.254:80
Oct  8 19:15:46 compute-0 ovn_metadata_agent[28637]:    server metadata /var/lib/neutron/metadata_proxy
Oct  8 19:15:46 compute-0 ovn_metadata_agent[28637]:    http-request add-header X-OVN-Network-ID ed492f30-88ab-4074-a37b-2efd9113a46f
Oct  8 19:15:46 compute-0 ovn_metadata_agent[28637]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  8 19:15:46 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:15:46.025 28643 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f', 'env', 'PROCESS_TAG=haproxy-ed492f30-88ab-4074-a37b-2efd9113a46f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ed492f30-88ab-4074-a37b-2efd9113a46f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  8 19:15:46 compute-0 podman[150977]: 2025-10-08 19:15:46.412511051 +0000 UTC m=+0.047066989 container create 646666533e733350053c63bfd3d1430cc9402ce93fda3a474bbeecdcbd537ad2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  8 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:46 compute-0 podman[150977]: 2025-10-08 19:15:46.38993212 +0000 UTC m=+0.024488068 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7
Oct  8 19:15:46 compute-0 systemd[1]: Started libpod-conmon-646666533e733350053c63bfd3d1430cc9402ce93fda3a474bbeecdcbd537ad2.scope.
Oct  8 19:15:46 compute-0 systemd[1]: Started libcrun container.
Oct  8 19:15:46 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bfed095f12a30ca38c2adabcd7edcb3837429b20db6e6f549181366129732c0f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  8 19:15:46 compute-0 podman[150977]: 2025-10-08 19:15:46.536463085 +0000 UTC m=+0.171019073 container init 646666533e733350053c63bfd3d1430cc9402ce93fda3a474bbeecdcbd537ad2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 19:15:46 compute-0 podman[150977]: 2025-10-08 19:15:46.543406625 +0000 UTC m=+0.177962573 container start 646666533e733350053c63bfd3d1430cc9402ce93fda3a474bbeecdcbd537ad2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 19:15:46 compute-0 neutron-haproxy-ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f[150993]: [NOTICE]   (150997) : New worker (150999) forked
Oct  8 19:15:46 compute-0 neutron-haproxy-ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f[150993]: [NOTICE]   (150997) : Loading success.
Oct  8 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.578 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950946.5780487, 34ca788d-2398-4a40-9f96-040c0849b18f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.578 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] VM Started (Lifecycle Event)#033[00m
Oct  8 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.581 2 DEBUG nova.compute.manager [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  8 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.584 2 DEBUG nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  8 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.587 2 INFO nova.virt.libvirt.driver [-] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Instance spawned successfully.#033[00m
Oct  8 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.588 2 DEBUG nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  8 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.615 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.621 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.625 2 DEBUG nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.625 2 DEBUG nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.626 2 DEBUG nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.626 2 DEBUG nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.627 2 DEBUG nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.627 2 DEBUG nova.virt.libvirt.driver [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  8 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.669 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.670 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950946.5803564, 34ca788d-2398-4a40-9f96-040c0849b18f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.671 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] VM Paused (Lifecycle Event)#033[00m
Oct  8 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.695 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.699 2 DEBUG nova.virt.driver [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] Emitting event <LifecycleEvent: 1759950946.5837963, 34ca788d-2398-4a40-9f96-040c0849b18f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.699 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] VM Resumed (Lifecycle Event)#033[00m
Oct  8 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.703 2 INFO nova.compute.manager [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Took 4.67 seconds to spawn the instance on the hypervisor.#033[00m
Oct  8 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.704 2 DEBUG nova.compute.manager [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.716 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.720 2 DEBUG nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  8 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.746 2 INFO nova.compute.manager [None req-8fc4cda7-d7fb-4380-8ed3-691abd81aad3 - - - - - -] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  8 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.766 2 INFO nova.compute.manager [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Took 5.16 seconds to build instance.#033[00m
Oct  8 19:15:46 compute-0 nova_compute[117514]: 2025-10-08 19:15:46.784 2 DEBUG oslo_concurrency.lockutils [None req-1f65272c-6a6c-49d2-b755-270efc401410 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "34ca788d-2398-4a40-9f96-040c0849b18f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.245s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:15:47 compute-0 nova_compute[117514]: 2025-10-08 19:15:47.967 2 DEBUG nova.compute.manager [req-61f8b5ca-003c-4ef2-a0f7-470b907fa277 req-a897d1ba-f550-4274-8459-41bd23f0df0f bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Received event network-vif-plugged-06998e1e-8ce7-484d-b3e4-7d44699229c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:15:47 compute-0 nova_compute[117514]: 2025-10-08 19:15:47.967 2 DEBUG oslo_concurrency.lockutils [req-61f8b5ca-003c-4ef2-a0f7-470b907fa277 req-a897d1ba-f550-4274-8459-41bd23f0df0f bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "34ca788d-2398-4a40-9f96-040c0849b18f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:15:47 compute-0 nova_compute[117514]: 2025-10-08 19:15:47.967 2 DEBUG oslo_concurrency.lockutils [req-61f8b5ca-003c-4ef2-a0f7-470b907fa277 req-a897d1ba-f550-4274-8459-41bd23f0df0f bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "34ca788d-2398-4a40-9f96-040c0849b18f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:15:47 compute-0 nova_compute[117514]: 2025-10-08 19:15:47.968 2 DEBUG oslo_concurrency.lockutils [req-61f8b5ca-003c-4ef2-a0f7-470b907fa277 req-a897d1ba-f550-4274-8459-41bd23f0df0f bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "34ca788d-2398-4a40-9f96-040c0849b18f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:15:47 compute-0 nova_compute[117514]: 2025-10-08 19:15:47.968 2 DEBUG nova.compute.manager [req-61f8b5ca-003c-4ef2-a0f7-470b907fa277 req-a897d1ba-f550-4274-8459-41bd23f0df0f bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] No waiting events found dispatching network-vif-plugged-06998e1e-8ce7-484d-b3e4-7d44699229c4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:15:47 compute-0 nova_compute[117514]: 2025-10-08 19:15:47.968 2 WARNING nova.compute.manager [req-61f8b5ca-003c-4ef2-a0f7-470b907fa277 req-a897d1ba-f550-4274-8459-41bd23f0df0f bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Received unexpected event network-vif-plugged-06998e1e-8ce7-484d-b3e4-7d44699229c4 for instance with vm_state active and task_state None.#033[00m
Oct  8 19:15:49 compute-0 nova_compute[117514]: 2025-10-08 19:15:49.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:51 compute-0 nova_compute[117514]: 2025-10-08 19:15:51.449 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:52 compute-0 podman[151008]: 2025-10-08 19:15:52.667203704 +0000 UTC m=+0.080804991 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 19:15:53 compute-0 ovn_controller[19759]: 2025-10-08T19:15:53Z|00159|binding|INFO|Releasing lport 58bfd3a1-f863-472a-ae8b-afc52524c7cc from this chassis (sb_readonly=0)
Oct  8 19:15:53 compute-0 NetworkManager[1035]: <info>  [1759950953.3017] manager: (patch-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/93)
Oct  8 19:15:53 compute-0 NetworkManager[1035]: <info>  [1759950953.3027] manager: (patch-br-int-to-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Oct  8 19:15:53 compute-0 nova_compute[117514]: 2025-10-08 19:15:53.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:53 compute-0 ovn_controller[19759]: 2025-10-08T19:15:53Z|00160|binding|INFO|Releasing lport 58bfd3a1-f863-472a-ae8b-afc52524c7cc from this chassis (sb_readonly=0)
Oct  8 19:15:53 compute-0 nova_compute[117514]: 2025-10-08 19:15:53.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:53 compute-0 nova_compute[117514]: 2025-10-08 19:15:53.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:53 compute-0 nova_compute[117514]: 2025-10-08 19:15:53.652 2 DEBUG nova.compute.manager [req-c9da2321-86a2-4eda-b592-9ca32225bc02 req-49c1532a-ea9d-47d3-9868-3dd390e80aec bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Received event network-changed-06998e1e-8ce7-484d-b3e4-7d44699229c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:15:53 compute-0 nova_compute[117514]: 2025-10-08 19:15:53.652 2 DEBUG nova.compute.manager [req-c9da2321-86a2-4eda-b592-9ca32225bc02 req-49c1532a-ea9d-47d3-9868-3dd390e80aec bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Refreshing instance network info cache due to event network-changed-06998e1e-8ce7-484d-b3e4-7d44699229c4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 19:15:53 compute-0 nova_compute[117514]: 2025-10-08 19:15:53.653 2 DEBUG oslo_concurrency.lockutils [req-c9da2321-86a2-4eda-b592-9ca32225bc02 req-49c1532a-ea9d-47d3-9868-3dd390e80aec bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-34ca788d-2398-4a40-9f96-040c0849b18f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:15:53 compute-0 nova_compute[117514]: 2025-10-08 19:15:53.653 2 DEBUG oslo_concurrency.lockutils [req-c9da2321-86a2-4eda-b592-9ca32225bc02 req-49c1532a-ea9d-47d3-9868-3dd390e80aec bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-34ca788d-2398-4a40-9f96-040c0849b18f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:15:53 compute-0 nova_compute[117514]: 2025-10-08 19:15:53.653 2 DEBUG nova.network.neutron [req-c9da2321-86a2-4eda-b592-9ca32225bc02 req-49c1532a-ea9d-47d3-9868-3dd390e80aec bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Refreshing network info cache for port 06998e1e-8ce7-484d-b3e4-7d44699229c4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 19:15:54 compute-0 nova_compute[117514]: 2025-10-08 19:15:54.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:55 compute-0 nova_compute[117514]: 2025-10-08 19:15:55.640 2 DEBUG nova.network.neutron [req-c9da2321-86a2-4eda-b592-9ca32225bc02 req-49c1532a-ea9d-47d3-9868-3dd390e80aec bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Updated VIF entry in instance network info cache for port 06998e1e-8ce7-484d-b3e4-7d44699229c4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 19:15:55 compute-0 nova_compute[117514]: 2025-10-08 19:15:55.640 2 DEBUG nova.network.neutron [req-c9da2321-86a2-4eda-b592-9ca32225bc02 req-49c1532a-ea9d-47d3-9868-3dd390e80aec bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Updating instance_info_cache with network_info: [{"id": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "address": "fa:16:3e:66:c0:df", "network": {"id": "ed492f30-88ab-4074-a37b-2efd9113a46f", "bridge": "br-int", "label": "tempest-network-smoke--1871519036", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06998e1e-8c", "ovs_interfaceid": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:15:55 compute-0 nova_compute[117514]: 2025-10-08 19:15:55.660 2 DEBUG oslo_concurrency.lockutils [req-c9da2321-86a2-4eda-b592-9ca32225bc02 req-49c1532a-ea9d-47d3-9868-3dd390e80aec bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-34ca788d-2398-4a40-9f96-040c0849b18f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:15:56 compute-0 nova_compute[117514]: 2025-10-08 19:15:56.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:15:57 compute-0 ovn_controller[19759]: 2025-10-08T19:15:57Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:66:c0:df 10.100.0.6
Oct  8 19:15:57 compute-0 ovn_controller[19759]: 2025-10-08T19:15:57Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:66:c0:df 10.100.0.6
Oct  8 19:15:59 compute-0 nova_compute[117514]: 2025-10-08 19:15:59.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:16:01 compute-0 nova_compute[117514]: 2025-10-08 19:16:01.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:16:02 compute-0 podman[151039]: 2025-10-08 19:16:02.68569629 +0000 UTC m=+0.100255502 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Oct  8 19:16:04 compute-0 nova_compute[117514]: 2025-10-08 19:16:04.038 2 INFO nova.compute.manager [None req-cf079058-f559-43bc-80a7-66aca8d75b7f efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Get console output
Oct  8 19:16:04 compute-0 nova_compute[117514]: 2025-10-08 19:16:04.044 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Oct  8 19:16:04 compute-0 nova_compute[117514]: 2025-10-08 19:16:04.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:16:06 compute-0 nova_compute[117514]: 2025-10-08 19:16:06.529 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:16:07 compute-0 ovn_controller[19759]: 2025-10-08T19:16:07Z|00161|binding|INFO|Releasing lport 58bfd3a1-f863-472a-ae8b-afc52524c7cc from this chassis (sb_readonly=0)
Oct  8 19:16:07 compute-0 nova_compute[117514]: 2025-10-08 19:16:07.217 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:16:07 compute-0 ovn_controller[19759]: 2025-10-08T19:16:07Z|00162|binding|INFO|Releasing lport 58bfd3a1-f863-472a-ae8b-afc52524c7cc from this chassis (sb_readonly=0)
Oct  8 19:16:07 compute-0 nova_compute[117514]: 2025-10-08 19:16:07.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.247 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'name': 'tempest-TestNetworkBasicOps-server-91776509', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000d', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'hostId': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.248 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.275 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/disk.device.write.latency volume: 2531671528 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.276 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '72dc93e4-4db4-4575-acbc-0c2e2b07582a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2531671528, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f-vda', 'timestamp': '2025-10-08T19:16:08.248361', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3e5df0fc-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.877558981, 'message_signature': '2cca6adb85a607eed5dab499add86d5b03e800faa49c3e816772b47916001841'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f-sda', 'timestamp': '2025-10-08T19:16:08.248361', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3e5dff98-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.877558981, 'message_signature': '25a16238e93801fd21bf067414414e3f74fa2eb3e56da4ad65d65f7dfc8401e3'}]}, 'timestamp': '2025-10-08 19:16:08.276700', '_unique_id': '21df5d5edc454087b4e47633c16b87b0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.278 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.279 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.282 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 34ca788d-2398-4a40-9f96-040c0849b18f / tap06998e1e-8c inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.282 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96771a21-8611-4f38-8a35-94424aa57293', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-0000000d-34ca788d-2398-4a40-9f96-040c0849b18f-tap06998e1e-8c', 'timestamp': '2025-10-08T19:16:08.279430', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'tap06998e1e-8c', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:66:c0:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06998e1e-8c'}, 'message_id': '3e5efd26-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.908671829, 'message_signature': '5700ad9e7c26f163d2e0fbe4d921acbc0e6336fff5459f91a549d9b6d37171d9'}]}, 'timestamp': '2025-10-08 19:16:08.283192', '_unique_id': '19841d70531e4883975198841fad608f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.283 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.284 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.309 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/memory.usage volume: 46.76953125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd59aca0c-2277-4ac8-85c9-b9e1a2bb1cb7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 46.76953125, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'timestamp': '2025-10-08T19:16:08.284401', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '3e6329e6-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.93888463, 'message_signature': '4a1d82699e05a1f4b51f0654251f7dd7ab16a200c2b38402e5630fcd7a81f930'}]}, 'timestamp': '2025-10-08 19:16:08.310672', '_unique_id': 'eb770c178df04038ad9eb1342c68a7a6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.312 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/disk.device.read.requests volume: 1095 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce5d6500-46a0-4717-b775-15f557ca001b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1095, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f-vda', 'timestamp': '2025-10-08T19:16:08.313001', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3e6393e0-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.877558981, 'message_signature': '155a16d03a70f7e8b006b4a5c6b86cbede419f6e682eda7487f3604ee58e1c7d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f-sda', 'timestamp': '2025-10-08T19:16:08.313001', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3e639c00-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.877558981, 'message_signature': 'e5cc09377353420cb56a5ba1a3e2581cde9d4b135f6eb62dd8bdae56e0f9c2bc'}]}, 'timestamp': '2025-10-08 19:16:08.313438', '_unique_id': 'dc35bf077240451b8467291769306fee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.313 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.314 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.314 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/network.incoming.bytes volume: 4787 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b322823a-9dd8-4330-a632-81daf3fb8db2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4787, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-0000000d-34ca788d-2398-4a40-9f96-040c0849b18f-tap06998e1e-8c', 'timestamp': '2025-10-08T19:16:08.314593', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'tap06998e1e-8c', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:66:c0:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06998e1e-8c'}, 'message_id': '3e63d24c-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.908671829, 'message_signature': 'fb056a64fc493f4dafa4ab1b1b64277432bca1582c91d8d11d74ebceb63b6066'}]}, 'timestamp': '2025-10-08 19:16:08.314878', '_unique_id': '2a64dac32fca4f22b157665029d2cbcf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.315 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.316 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.316 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-91776509>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-91776509>]
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.316 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.316 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/disk.device.read.latency volume: 544247269 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.316 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/disk.device.read.latency volume: 129480851 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ec0f7003-2378-41ac-b812-f33f9a6034a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 544247269, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f-vda', 'timestamp': '2025-10-08T19:16:08.316351', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3e6416d0-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.877558981, 'message_signature': '2396955e96e4f1eb7dab241e320d147af8b2dcd3e3c7aa246f4c90358ef6f70e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 129480851, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f-sda', 'timestamp': '2025-10-08T19:16:08.316351', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3e641f4a-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.877558981, 'message_signature': '1b2fcbfe0ef43c5b8ee8ffcce760a0be58f73e9f3f154240e29da2be8ee36949'}]}, 'timestamp': '2025-10-08 19:16:08.316801', '_unique_id': 'b303460c716d433aa4209cac740f8693'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.317 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.319 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.319 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '476bf45b-8594-41e9-869f-f9eb34df3f0f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-0000000d-34ca788d-2398-4a40-9f96-040c0849b18f-tap06998e1e-8c', 'timestamp': '2025-10-08T19:16:08.319364', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'tap06998e1e-8c', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:66:c0:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06998e1e-8c'}, 'message_id': '3e648c0a-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.908671829, 'message_signature': '2fcbb450a2ba4f36f339b9ed4a63299c68ee7cc96edb4c435d188325d46caa79'}]}, 'timestamp': '2025-10-08 19:16:08.319599', '_unique_id': 'a1bfac530f864186af172ebbdf152b1c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.320 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4415440a-dbc4-4007-ad0f-5945dc35be65', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-0000000d-34ca788d-2398-4a40-9f96-040c0849b18f-tap06998e1e-8c', 'timestamp': '2025-10-08T19:16:08.320781', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'tap06998e1e-8c', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:66:c0:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06998e1e-8c'}, 'message_id': '3e64c40e-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.908671829, 'message_signature': '0ecb0f67182b8165754589fe2cce7d3e913cd56666d5d57c6fd2614fd79fe981'}]}, 'timestamp': '2025-10-08 19:16:08.321032', '_unique_id': '4f2703e7f82545c4a2b8382c27c882ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.321 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.322 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.322 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/network.outgoing.bytes volume: 3418 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0e564cc-3577-4dee-9bf8-a8e1813ce352', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3418, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-0000000d-34ca788d-2398-4a40-9f96-040c0849b18f-tap06998e1e-8c', 'timestamp': '2025-10-08T19:16:08.322363', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'tap06998e1e-8c', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:66:c0:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06998e1e-8c'}, 'message_id': '3e65022a-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.908671829, 'message_signature': 'ed2a10d91fb35ae89864b9843209e8cf6c4d1360c00987bd42598d5716a6a0b1'}]}, 'timestamp': '2025-10-08 19:16:08.322658', '_unique_id': '163ab4d64cd440ea87b70d911ca8523a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.323 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9cfc94b7-f0e2-488b-b1f5-2d5b2c86fc37', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-0000000d-34ca788d-2398-4a40-9f96-040c0849b18f-tap06998e1e-8c', 'timestamp': '2025-10-08T19:16:08.324047', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'tap06998e1e-8c', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:66:c0:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06998e1e-8c'}, 'message_id': '3e6543de-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.908671829, 'message_signature': 'f0e4d7ae985b4e58b6ec6cf83f7c6f2dfcc914a5322e6d521ccdbbebee823cd3'}]}, 'timestamp': '2025-10-08 19:16:08.324340', '_unique_id': '8244c41ed88a4703a36e1e855f620c08'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.324 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.325 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.341 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.341 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a5a6437-4fa9-4249-a2cd-01597b4b24e6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f-vda', 'timestamp': '2025-10-08T19:16:08.325724', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3e67e008-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.954958523, 'message_signature': 'e9c0729a8890df944a7fa07691c323a6783f301986a286585b90ba596ecc2ff3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f-sda', 'timestamp': '2025-10-08T19:16:08.325724', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3e67ea4e-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.954958523, 'message_signature': '69faa9f89a4b48fa58723b30172ae680d9597365d937600f994222fe3a964020'}]}, 'timestamp': '2025-10-08 19:16:08.341684', '_unique_id': '6c8b0189508440f7bedee4e0cbb03c8d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.342 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.343 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.343 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/disk.device.write.requests volume: 315 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.343 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '51614514-a2af-40e6-ae6e-affdac1802a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 315, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f-vda', 'timestamp': '2025-10-08T19:16:08.343260', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3e683828-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.877558981, 'message_signature': '33b03d1c55485236214fba45e252be2721e906e517b7346ef6394b768635303f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f-sda', 'timestamp': '2025-10-08T19:16:08.343260', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3e6840fc-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.877558981, 'message_signature': '078822d540be3fd9e001570b0ab2af9b6bc556f6d1801dac4f2c6af3cd45b08e'}]}, 'timestamp': '2025-10-08 19:16:08.343901', '_unique_id': '6cbb75bc20394891af6b7802b9110c5e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.344 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8027f9fc-34b1-4fdc-be7a-dbad4b396155', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 28, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-0000000d-34ca788d-2398-4a40-9f96-040c0849b18f-tap06998e1e-8c', 'timestamp': '2025-10-08T19:16:08.345041', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'tap06998e1e-8c', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:66:c0:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06998e1e-8c'}, 'message_id': '3e687702-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.908671829, 'message_signature': '9b57970da1a606f3e1b680603172f5ec3d5c0d69c37fa8fbaf73550cd3ca66ce'}]}, 'timestamp': '2025-10-08 19:16:08.345278', '_unique_id': '1f9885cdb3e041c495c96367a07c70c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.345 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.346 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.346 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/network.incoming.packets volume: 30 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad60ba67-b4aa-4bbe-b7c9-7ff7ff185fe6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 30, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-0000000d-34ca788d-2398-4a40-9f96-040c0849b18f-tap06998e1e-8c', 'timestamp': '2025-10-08T19:16:08.346339', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'tap06998e1e-8c', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:66:c0:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06998e1e-8c'}, 'message_id': '3e68a970-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.908671829, 'message_signature': '376f391ee88204142899f4c96da426255ca6f8fae2e7637d7c0f672564a19a32'}]}, 'timestamp': '2025-10-08 19:16:08.346563', '_unique_id': '50160e4e2e2041eaa48ff6e95d2e8930'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-91776509>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-91776509>]
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.347 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '86b663c9-6515-4ebf-ba39-902cf423d9f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f-vda', 'timestamp': '2025-10-08T19:16:08.348033', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3e68eba6-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.954958523, 'message_signature': 'a9bb50635ea3052b4563da5b6bebba8f6cf7576ac33d1e1833865423c9a74d80'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f-sda', 'timestamp': '2025-10-08T19:16:08.348033', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3e68f470-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.954958523, 'message_signature': 'c952668bfeb2b9a3acd2993c5c7e4d5cb6ebc0eecaf406b575e9464edeb09b6d'}]}, 'timestamp': '2025-10-08 19:16:08.348470', '_unique_id': '2522ece249f94cf995c7f06a02ba2a42'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.348 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.349 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.349 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/cpu volume: 10390000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a7e66e6-c9a9-4987-b71a-7e35cd7f3ed8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10390000000, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'timestamp': '2025-10-08T19:16:08.349688', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '3e692c6a-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.93888463, 'message_signature': 'b91d0c66008a5cad45e0d1470c16cb16ba4d89ba1d34ad4ead0383f97cd9098d'}]}, 'timestamp': '2025-10-08 19:16:08.349928', '_unique_id': 'e4c48e6c2e50436a82c44b362653319b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.350 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.351 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.351 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-91776509>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-91776509>]
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.351 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.351 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/disk.device.write.bytes volume: 72921088 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.351 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aaa7ad5f-aca7-4002-ae9a-4e219493f152', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72921088, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f-vda', 'timestamp': '2025-10-08T19:16:08.351240', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3e6968c4-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.877558981, 'message_signature': '41db504c1c967a82720ac6d6305a1507d31cdd3c057a5cc55bb21432a93c5bdf'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f-sda', 'timestamp': '2025-10-08T19:16:08.351240', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3e6970d0-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.877558981, 'message_signature': 'edf959db9ab02fb313f0c69647ae15094bcfd5cbfbbc43b6b635502b919918ad'}]}, 'timestamp': '2025-10-08 19:16:08.351666', '_unique_id': '9e48724af9cc43ea8e2d98899f32f6a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.352 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/disk.device.read.bytes volume: 30534144 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f4afe885-ebc6-4093-a691-45f21730c944', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30534144, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f-vda', 'timestamp': '2025-10-08T19:16:08.352763', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3e69a53c-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.877558981, 'message_signature': 'd1c9da380ffd3fcc4b7c8624e09b9ee156440b08dfdb73b7d99b89dd1a513ff6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f-sda', 'timestamp': '2025-10-08T19:16:08.352763', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3e69ad20-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.877558981, 'message_signature': 'd06543f1134237cb3af1ed1878c9bd95b54fc0bbb22cc6636721c045b1796522'}]}, 'timestamp': '2025-10-08 19:16:08.353197', '_unique_id': '2882f3270d8345ce80be62bc7da8ec1e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.353 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.354 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.354 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.354 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d736ac4-516d-46f0-98ea-feac04a96ea3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f-vda', 'timestamp': '2025-10-08T19:16:08.354283', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '3e69dfb6-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.954958523, 'message_signature': 'fb850d79a02b462f269d981ba43243a205aae1a8f30b8dab7c79f630f7c0c8cc'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': '34ca788d-2398-4a40-9f96-040c0849b18f-sda', 'timestamp': '2025-10-08T19:16:08.354283', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'instance-0000000d', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '3e69e74a-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.954958523, 'message_signature': 'a7a04f422489f2dc533b6a48fc5164bee6bd81a5a60e91aa3bb10f93c6f58de7'}]}, 'timestamp': '2025-10-08 19:16:08.354687', '_unique_id': '6244346e4a1d44a298f29715442aa3a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.355 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f02e40c8-c860-49a5-a46a-28aa516f4d16', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-0000000d-34ca788d-2398-4a40-9f96-040c0849b18f-tap06998e1e-8c', 'timestamp': '2025-10-08T19:16:08.355764', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'tap06998e1e-8c', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:66:c0:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06998e1e-8c'}, 'message_id': '3e6a1a3a-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.908671829, 'message_signature': '14552c990777bf9130f26c5beee75171d7c8ded823d76484fa5eff514e5ac3d7'}]}, 'timestamp': '2025-10-08 19:16:08.356006', '_unique_id': '83d2c77094d94ce1901f74004919dd70'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.356 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 DEBUG ceilometer.compute.pollsters [-] 34ca788d-2398-4a40-9f96-040c0849b18f/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2ca24662-01da-4cb8-b9b1-84d596be3cbc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'efdb1424acdb478684cdb088b373ba05', 'user_name': None, 'project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'project_name': None, 'resource_id': 'instance-0000000d-34ca788d-2398-4a40-9f96-040c0849b18f-tap06998e1e-8c', 'timestamp': '2025-10-08T19:16:08.357073', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-91776509', 'name': 'tap06998e1e-8c', 'instance_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'instance_type': 'm1.nano', 'host': 'f3acba7c057f2e366142857700318e6a3581af3bf9975f50a1bf7805', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'e8a148fc-4419-4813-98ff-a17e2a95609e', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '23cfa426-7011-4566-992d-1c7af39f70dd'}, 'image_ref': '23cfa426-7011-4566-992d-1c7af39f70dd', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:66:c0:df', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap06998e1e-8c'}, 'message_id': '3e6a4ca8-a47b-11f0-a3bc-fa163e22ef71', 'monotonic_time': 1638.908671829, 'message_signature': '6691dbfe056510ebc14d82f7073dc37d9bdf3ea0c2629b5c8bf14528f3623316'}]}, 'timestamp': '2025-10-08 19:16:08.357300', '_unique_id': '3ecff87bcd0b42c2b65d18acbad87a7a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     yield
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.357 12 ERROR oslo_messaging.notify.messaging 
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.358 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.358 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  8 19:16:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:16:08.358 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-91776509>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-91776509>]
Oct  8 19:16:08 compute-0 nova_compute[117514]: 2025-10-08 19:16:08.416 2 INFO nova.compute.manager [None req-7e70d0df-0954-4234-a0c8-1d8584090420 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Get console output#033[00m
Oct  8 19:16:08 compute-0 nova_compute[117514]: 2025-10-08 19:16:08.424 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 19:16:08 compute-0 podman[151060]: 2025-10-08 19:16:08.656112657 +0000 UTC m=+0.070578416 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.expose-services=, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.tags=minimal rhel9, release=1755695350, managed_by=edpm_ansible, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-type=git, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  8 19:16:08 compute-0 podman[151062]: 2025-10-08 19:16:08.689414697 +0000 UTC m=+0.086812814 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 19:16:08 compute-0 podman[151061]: 2025-10-08 19:16:08.705737878 +0000 UTC m=+0.110314532 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251001, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  8 19:16:08 compute-0 nova_compute[117514]: 2025-10-08 19:16:08.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:16:09 compute-0 nova_compute[117514]: 2025-10-08 19:16:09.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:16:09 compute-0 NetworkManager[1035]: <info>  [1759950969.5446] manager: (patch-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/95)
Oct  8 19:16:09 compute-0 NetworkManager[1035]: <info>  [1759950969.5463] manager: (patch-br-int-to-provnet-64c51c9c-a066-44c7-bc3d-9c8bcfc2a465): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/96)
Oct  8 19:16:09 compute-0 ovn_controller[19759]: 2025-10-08T19:16:09Z|00163|binding|INFO|Releasing lport 58bfd3a1-f863-472a-ae8b-afc52524c7cc from this chassis (sb_readonly=0)
Oct  8 19:16:09 compute-0 nova_compute[117514]: 2025-10-08 19:16:09.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:16:09 compute-0 nova_compute[117514]: 2025-10-08 19:16:09.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:16:09 compute-0 nova_compute[117514]: 2025-10-08 19:16:09.802 2 INFO nova.compute.manager [None req-5300206d-4729-458f-b7a9-d3b3ebee095e efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Get console output#033[00m
Oct  8 19:16:09 compute-0 nova_compute[117514]: 2025-10-08 19:16:09.807 54 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  8 19:16:09 compute-0 nova_compute[117514]: 2025-10-08 19:16:09.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.481 2 DEBUG nova.compute.manager [req-22c2d281-ede0-4d3d-b098-ae60029ca9f2 req-5634af68-6737-47e9-a68f-d3ab6250645b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Received event network-changed-06998e1e-8ce7-484d-b3e4-7d44699229c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.482 2 DEBUG nova.compute.manager [req-22c2d281-ede0-4d3d-b098-ae60029ca9f2 req-5634af68-6737-47e9-a68f-d3ab6250645b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Refreshing instance network info cache due to event network-changed-06998e1e-8ce7-484d-b3e4-7d44699229c4. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  8 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.482 2 DEBUG oslo_concurrency.lockutils [req-22c2d281-ede0-4d3d-b098-ae60029ca9f2 req-5634af68-6737-47e9-a68f-d3ab6250645b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "refresh_cache-34ca788d-2398-4a40-9f96-040c0849b18f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  8 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.483 2 DEBUG oslo_concurrency.lockutils [req-22c2d281-ede0-4d3d-b098-ae60029ca9f2 req-5634af68-6737-47e9-a68f-d3ab6250645b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquired lock "refresh_cache-34ca788d-2398-4a40-9f96-040c0849b18f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  8 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.483 2 DEBUG nova.network.neutron [req-22c2d281-ede0-4d3d-b098-ae60029ca9f2 req-5634af68-6737-47e9-a68f-d3ab6250645b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Refreshing network info cache for port 06998e1e-8ce7-484d-b3e4-7d44699229c4 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  8 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.521 2 DEBUG oslo_concurrency.lockutils [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "34ca788d-2398-4a40-9f96-040c0849b18f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.522 2 DEBUG oslo_concurrency.lockutils [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "34ca788d-2398-4a40-9f96-040c0849b18f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.523 2 DEBUG oslo_concurrency.lockutils [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "34ca788d-2398-4a40-9f96-040c0849b18f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.523 2 DEBUG oslo_concurrency.lockutils [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "34ca788d-2398-4a40-9f96-040c0849b18f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.523 2 DEBUG oslo_concurrency.lockutils [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "34ca788d-2398-4a40-9f96-040c0849b18f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.524 2 INFO nova.compute.manager [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Terminating instance#033[00m
Oct  8 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.526 2 DEBUG nova.compute.manager [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  8 19:16:10 compute-0 kernel: tap06998e1e-8c (unregistering): left promiscuous mode
Oct  8 19:16:10 compute-0 NetworkManager[1035]: <info>  [1759950970.5523] device (tap06998e1e-8c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  8 19:16:10 compute-0 ovn_controller[19759]: 2025-10-08T19:16:10Z|00164|binding|INFO|Releasing lport 06998e1e-8ce7-484d-b3e4-7d44699229c4 from this chassis (sb_readonly=0)
Oct  8 19:16:10 compute-0 ovn_controller[19759]: 2025-10-08T19:16:10Z|00165|binding|INFO|Setting lport 06998e1e-8ce7-484d-b3e4-7d44699229c4 down in Southbound
Oct  8 19:16:10 compute-0 ovn_controller[19759]: 2025-10-08T19:16:10Z|00166|binding|INFO|Removing iface tap06998e1e-8c ovn-installed in OVS
Oct  8 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:16:10 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:10.580 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:66:c0:df 10.100.0.6'], port_security=['fa:16:3e:66:c0:df 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '34ca788d-2398-4a40-9f96-040c0849b18f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ed492f30-88ab-4074-a37b-2efd9113a46f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7f7c752a9c5498f8eda73e461895ac9', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0aa04153-3da7-40f5-b74d-f2ebacf56fd3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d714e95d-17df-46e0-aa89-985c7cbd12a3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>], logical_port=06998e1e-8ce7-484d-b3e4-7d44699229c4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f9cb2811e50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:16:10 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:10.584 28643 INFO neutron.agent.ovn.metadata.agent [-] Port 06998e1e-8ce7-484d-b3e4-7d44699229c4 in datapath ed492f30-88ab-4074-a37b-2efd9113a46f unbound from our chassis#033[00m
Oct  8 19:16:10 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:10.586 28643 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ed492f30-88ab-4074-a37b-2efd9113a46f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  8 19:16:10 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:10.588 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[50f3775a-0542-48bf-8062-59cc90e8b7c4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:16:10 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:10.589 28643 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f namespace which is not needed anymore#033[00m
Oct  8 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:16:10 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Oct  8 19:16:10 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000000d.scope: Consumed 12.617s CPU time.
Oct  8 19:16:10 compute-0 systemd-machined[77568]: Machine qemu-13-instance-0000000d terminated.
Oct  8 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:16:10 compute-0 podman[151138]: 2025-10-08 19:16:10.793898658 +0000 UTC m=+0.100222571 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 19:16:10 compute-0 neutron-haproxy-ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f[150993]: [NOTICE]   (150997) : haproxy version is 2.8.14-c23fe91
Oct  8 19:16:10 compute-0 neutron-haproxy-ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f[150993]: [NOTICE]   (150997) : path to executable is /usr/sbin/haproxy
Oct  8 19:16:10 compute-0 neutron-haproxy-ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f[150993]: [ALERT]    (150997) : Current worker (150999) exited with code 143 (Terminated)
Oct  8 19:16:10 compute-0 neutron-haproxy-ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f[150993]: [WARNING]  (150997) : All workers exited. Exiting... (0)
Oct  8 19:16:10 compute-0 systemd[1]: libpod-646666533e733350053c63bfd3d1430cc9402ce93fda3a474bbeecdcbd537ad2.scope: Deactivated successfully.
Oct  8 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.804 2 INFO nova.virt.libvirt.driver [-] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Instance destroyed successfully.#033[00m
Oct  8 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.805 2 DEBUG nova.objects.instance [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lazy-loading 'resources' on Instance uuid 34ca788d-2398-4a40-9f96-040c0849b18f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  8 19:16:10 compute-0 podman[151156]: 2025-10-08 19:16:10.81064566 +0000 UTC m=+0.068543187 container died 646666533e733350053c63bfd3d1430cc9402ce93fda3a474bbeecdcbd537ad2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.825 2 DEBUG nova.virt.libvirt.vif [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-08T19:15:40Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-91776509',display_name='tempest-TestNetworkBasicOps-server-91776509',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-91776509',id=13,image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBI8PTrGv1QybFIubtsg8lczGea0IvQL8pvhihemAZSj0UMnf1scRH00KmJvAMVhcwpSfJBSsSB9h8z57cU6NeYho/jEOEiMidDlTZU4qxsLiPufykBInXUSkP3hGqOiJaw==',key_name='tempest-TestNetworkBasicOps-916834063',keypairs=<?>,launch_index=0,launched_at=2025-10-08T19:15:46Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b7f7c752a9c5498f8eda73e461895ac9',ramdisk_id='',reservation_id='r-q8bisj0t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='23cfa426-7011-4566-992d-1c7af39f70dd',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1122149477',owner_user_name='tempest-TestNetworkBasicOps-1122149477-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-08T19:15:46Z,user_data=None,user_id='efdb1424acdb478684cdb088b373ba05',uuid=34ca788d-2398-4a40-9f96-040c0849b18f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "address": "fa:16:3e:66:c0:df", "network": {"id": "ed492f30-88ab-4074-a37b-2efd9113a46f", "bridge": "br-int", "label": "tempest-network-smoke--1871519036", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06998e1e-8c", "ovs_interfaceid": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  8 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.826 2 DEBUG nova.network.os_vif_util [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converting VIF {"id": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "address": "fa:16:3e:66:c0:df", "network": {"id": "ed492f30-88ab-4074-a37b-2efd9113a46f", "bridge": "br-int", "label": "tempest-network-smoke--1871519036", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06998e1e-8c", "ovs_interfaceid": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  8 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.826 2 DEBUG nova.network.os_vif_util [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:66:c0:df,bridge_name='br-int',has_traffic_filtering=True,id=06998e1e-8ce7-484d-b3e4-7d44699229c4,network=Network(ed492f30-88ab-4074-a37b-2efd9113a46f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06998e1e-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  8 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.827 2 DEBUG os_vif [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:66:c0:df,bridge_name='br-int',has_traffic_filtering=True,id=06998e1e-8ce7-484d-b3e4-7d44699229c4,network=Network(ed492f30-88ab-4074-a37b-2efd9113a46f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06998e1e-8c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  8 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.829 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap06998e1e-8c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.834 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.838 2 INFO os_vif [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:66:c0:df,bridge_name='br-int',has_traffic_filtering=True,id=06998e1e-8ce7-484d-b3e4-7d44699229c4,network=Network(ed492f30-88ab-4074-a37b-2efd9113a46f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap06998e1e-8c')#033[00m
Oct  8 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.839 2 INFO nova.virt.libvirt.driver [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Deleting instance files /var/lib/nova/instances/34ca788d-2398-4a40-9f96-040c0849b18f_del#033[00m
Oct  8 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.839 2 INFO nova.virt.libvirt.driver [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Deletion of /var/lib/nova/instances/34ca788d-2398-4a40-9f96-040c0849b18f_del complete#033[00m
Oct  8 19:16:10 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-646666533e733350053c63bfd3d1430cc9402ce93fda3a474bbeecdcbd537ad2-userdata-shm.mount: Deactivated successfully.
Oct  8 19:16:10 compute-0 systemd[1]: var-lib-containers-storage-overlay-bfed095f12a30ca38c2adabcd7edcb3837429b20db6e6f549181366129732c0f-merged.mount: Deactivated successfully.
Oct  8 19:16:10 compute-0 podman[151156]: 2025-10-08 19:16:10.87442598 +0000 UTC m=+0.132323467 container cleanup 646666533e733350053c63bfd3d1430cc9402ce93fda3a474bbeecdcbd537ad2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.884 2 INFO nova.compute.manager [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Oct  8 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.884 2 DEBUG oslo.service.loopingcall [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  8 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.884 2 DEBUG nova.compute.manager [-] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  8 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.885 2 DEBUG nova.network.neutron [-] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  8 19:16:10 compute-0 systemd[1]: libpod-conmon-646666533e733350053c63bfd3d1430cc9402ce93fda3a474bbeecdcbd537ad2.scope: Deactivated successfully.
Oct  8 19:16:10 compute-0 podman[151185]: 2025-10-08 19:16:10.909732478 +0000 UTC m=+0.099954153 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, config_id=iscsid, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  8 19:16:10 compute-0 podman[151194]: 2025-10-08 19:16:10.939157126 +0000 UTC m=+0.120342061 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 19:16:10 compute-0 podman[151239]: 2025-10-08 19:16:10.952885242 +0000 UTC m=+0.051585709 container remove 646666533e733350053c63bfd3d1430cc9402ce93fda3a474bbeecdcbd537ad2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=neutron-haproxy-ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  8 19:16:10 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:10.959 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[ec551f93-a670-45bc-9b6f-75c994d318ed]: (4, ('Wed Oct  8 07:16:10 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f (646666533e733350053c63bfd3d1430cc9402ce93fda3a474bbeecdcbd537ad2)\n646666533e733350053c63bfd3d1430cc9402ce93fda3a474bbeecdcbd537ad2\nWed Oct  8 07:16:10 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f (646666533e733350053c63bfd3d1430cc9402ce93fda3a474bbeecdcbd537ad2)\n646666533e733350053c63bfd3d1430cc9402ce93fda3a474bbeecdcbd537ad2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:16:10 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:10.961 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[3c05792e-9db9-4474-9e7f-753ed98e7909]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:16:10 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:10.962 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=taped492f30-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:16:10 compute-0 kernel: taped492f30-80: left promiscuous mode
Oct  8 19:16:10 compute-0 nova_compute[117514]: 2025-10-08 19:16:10.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:16:10 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:10.984 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[787aa40e-35b9-4dfc-85bc-51cc88d2fb7a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:16:11 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:11.025 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[fbac3123-a210-4de3-b7c6-e0deff758349]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:16:11 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:11.034 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[f2999cc5-91d4-4680-83c2-0ba2a66c3e4b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:16:11 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:11.054 144726 DEBUG oslo.privsep.daemon [-] privsep: reply[b8f812bb-8a32-4f02-a448-115a73cc6674]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 161626, 'reachable_time': 44684, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 151265, 'error': None, 'target': 'ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:16:11 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:11.057 28783 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ed492f30-88ab-4074-a37b-2efd9113a46f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  8 19:16:11 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:11.057 28783 DEBUG oslo.privsep.daemon [-] privsep: reply[54c88c11-2272-4fab-a658-441017e0f122]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  8 19:16:11 compute-0 systemd[1]: run-netns-ovnmeta\x2ded492f30\x2d88ab\x2d4074\x2da37b\x2d2efd9113a46f.mount: Deactivated successfully.
Oct  8 19:16:11 compute-0 nova_compute[117514]: 2025-10-08 19:16:11.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:16:11 compute-0 nova_compute[117514]: 2025-10-08 19:16:11.713 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:16:11 compute-0 nova_compute[117514]: 2025-10-08 19:16:11.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:16:11 compute-0 nova_compute[117514]: 2025-10-08 19:16:11.722 2 DEBUG nova.network.neutron [-] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:16:11 compute-0 nova_compute[117514]: 2025-10-08 19:16:11.746 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:16:11 compute-0 nova_compute[117514]: 2025-10-08 19:16:11.747 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:16:11 compute-0 nova_compute[117514]: 2025-10-08 19:16:11.747 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:16:11 compute-0 nova_compute[117514]: 2025-10-08 19:16:11.748 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 19:16:11 compute-0 nova_compute[117514]: 2025-10-08 19:16:11.749 2 INFO nova.compute.manager [-] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Took 0.86 seconds to deallocate network for instance.#033[00m
Oct  8 19:16:11 compute-0 nova_compute[117514]: 2025-10-08 19:16:11.794 2 DEBUG oslo_concurrency.lockutils [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:16:11 compute-0 nova_compute[117514]: 2025-10-08 19:16:11.795 2 DEBUG oslo_concurrency.lockutils [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:16:11 compute-0 nova_compute[117514]: 2025-10-08 19:16:11.900 2 DEBUG nova.scheduler.client.report [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Refreshing inventories for resource provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  8 19:16:11 compute-0 nova_compute[117514]: 2025-10-08 19:16:11.948 2 DEBUG nova.scheduler.client.report [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Updating ProviderTree inventory for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  8 19:16:11 compute-0 nova_compute[117514]: 2025-10-08 19:16:11.950 2 DEBUG nova.compute.provider_tree [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Updating inventory in ProviderTree for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 19:16:11 compute-0 nova_compute[117514]: 2025-10-08 19:16:11.997 2 DEBUG nova.scheduler.client.report [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Refreshing aggregate associations for resource provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  8 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.019 2 DEBUG nova.scheduler.client.report [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Refreshing trait associations for resource provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_FDC,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AVX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_FMA3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SVM,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX2,HW_CPU_X86_SSE,HW_CPU_X86_CLMUL,HW_CPU_X86_F16C,HW_CPU_X86_AMD_SVM,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_BMI,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_PCNET _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  8 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.066 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.067 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6072MB free_disk=73.41375732421875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.068 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.072 2 DEBUG nova.compute.provider_tree [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.088 2 DEBUG nova.scheduler.client.report [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.115 2 DEBUG oslo_concurrency.lockutils [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.320s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.119 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.051s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.140 2 INFO nova.scheduler.client.report [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Deleted allocations for instance 34ca788d-2398-4a40-9f96-040c0849b18f#033[00m
Oct  8 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.183 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.184 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.199 2 DEBUG nova.network.neutron [req-22c2d281-ede0-4d3d-b098-ae60029ca9f2 req-5634af68-6737-47e9-a68f-d3ab6250645b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Updated VIF entry in instance network info cache for port 06998e1e-8ce7-484d-b3e4-7d44699229c4. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  8 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.200 2 DEBUG nova.network.neutron [req-22c2d281-ede0-4d3d-b098-ae60029ca9f2 req-5634af68-6737-47e9-a68f-d3ab6250645b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Updating instance_info_cache with network_info: [{"id": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "address": "fa:16:3e:66:c0:df", "network": {"id": "ed492f30-88ab-4074-a37b-2efd9113a46f", "bridge": "br-int", "label": "tempest-network-smoke--1871519036", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b7f7c752a9c5498f8eda73e461895ac9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap06998e1e-8c", "ovs_interfaceid": "06998e1e-8ce7-484d-b3e4-7d44699229c4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  8 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.210 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.228 2 DEBUG oslo_concurrency.lockutils [None req-717057f0-4d5b-4ceb-ba48-61ca033c4390 efdb1424acdb478684cdb088b373ba05 b7f7c752a9c5498f8eda73e461895ac9 - - default default] Lock "34ca788d-2398-4a40-9f96-040c0849b18f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.706s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.230 2 DEBUG oslo_concurrency.lockutils [req-22c2d281-ede0-4d3d-b098-ae60029ca9f2 req-5634af68-6737-47e9-a68f-d3ab6250645b bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Releasing lock "refresh_cache-34ca788d-2398-4a40-9f96-040c0849b18f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  8 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.231 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.250 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.250 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.554 2 DEBUG nova.compute.manager [req-3a1eb7ba-716b-4e53-b280-1e2be26a415a req-6b33ecbc-e6e4-4629-92dc-935fea6f8c40 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Received event network-vif-unplugged-06998e1e-8ce7-484d-b3e4-7d44699229c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.555 2 DEBUG oslo_concurrency.lockutils [req-3a1eb7ba-716b-4e53-b280-1e2be26a415a req-6b33ecbc-e6e4-4629-92dc-935fea6f8c40 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "34ca788d-2398-4a40-9f96-040c0849b18f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.555 2 DEBUG oslo_concurrency.lockutils [req-3a1eb7ba-716b-4e53-b280-1e2be26a415a req-6b33ecbc-e6e4-4629-92dc-935fea6f8c40 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "34ca788d-2398-4a40-9f96-040c0849b18f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.556 2 DEBUG oslo_concurrency.lockutils [req-3a1eb7ba-716b-4e53-b280-1e2be26a415a req-6b33ecbc-e6e4-4629-92dc-935fea6f8c40 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "34ca788d-2398-4a40-9f96-040c0849b18f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.556 2 DEBUG nova.compute.manager [req-3a1eb7ba-716b-4e53-b280-1e2be26a415a req-6b33ecbc-e6e4-4629-92dc-935fea6f8c40 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] No waiting events found dispatching network-vif-unplugged-06998e1e-8ce7-484d-b3e4-7d44699229c4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.557 2 WARNING nova.compute.manager [req-3a1eb7ba-716b-4e53-b280-1e2be26a415a req-6b33ecbc-e6e4-4629-92dc-935fea6f8c40 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Received unexpected event network-vif-unplugged-06998e1e-8ce7-484d-b3e4-7d44699229c4 for instance with vm_state deleted and task_state None.#033[00m
Oct  8 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.557 2 DEBUG nova.compute.manager [req-3a1eb7ba-716b-4e53-b280-1e2be26a415a req-6b33ecbc-e6e4-4629-92dc-935fea6f8c40 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Received event network-vif-plugged-06998e1e-8ce7-484d-b3e4-7d44699229c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.558 2 DEBUG oslo_concurrency.lockutils [req-3a1eb7ba-716b-4e53-b280-1e2be26a415a req-6b33ecbc-e6e4-4629-92dc-935fea6f8c40 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Acquiring lock "34ca788d-2398-4a40-9f96-040c0849b18f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.558 2 DEBUG oslo_concurrency.lockutils [req-3a1eb7ba-716b-4e53-b280-1e2be26a415a req-6b33ecbc-e6e4-4629-92dc-935fea6f8c40 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "34ca788d-2398-4a40-9f96-040c0849b18f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.559 2 DEBUG oslo_concurrency.lockutils [req-3a1eb7ba-716b-4e53-b280-1e2be26a415a req-6b33ecbc-e6e4-4629-92dc-935fea6f8c40 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] Lock "34ca788d-2398-4a40-9f96-040c0849b18f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.559 2 DEBUG nova.compute.manager [req-3a1eb7ba-716b-4e53-b280-1e2be26a415a req-6b33ecbc-e6e4-4629-92dc-935fea6f8c40 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] No waiting events found dispatching network-vif-plugged-06998e1e-8ce7-484d-b3e4-7d44699229c4 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  8 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.559 2 WARNING nova.compute.manager [req-3a1eb7ba-716b-4e53-b280-1e2be26a415a req-6b33ecbc-e6e4-4629-92dc-935fea6f8c40 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Received unexpected event network-vif-plugged-06998e1e-8ce7-484d-b3e4-7d44699229c4 for instance with vm_state deleted and task_state None.#033[00m
Oct  8 19:16:12 compute-0 nova_compute[117514]: 2025-10-08 19:16:12.560 2 DEBUG nova.compute.manager [req-3a1eb7ba-716b-4e53-b280-1e2be26a415a req-6b33ecbc-e6e4-4629-92dc-935fea6f8c40 bd203f3f2df64e0380402a7598bf59c5 07729a10f89d452d9d405b605b02a687 - - default default] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Received event network-vif-deleted-06998e1e-8ce7-484d-b3e4-7d44699229c4 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  8 19:16:13 compute-0 nova_compute[117514]: 2025-10-08 19:16:13.251 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:16:13 compute-0 nova_compute[117514]: 2025-10-08 19:16:13.252 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 19:16:13 compute-0 nova_compute[117514]: 2025-10-08 19:16:13.273 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 19:16:13 compute-0 nova_compute[117514]: 2025-10-08 19:16:13.274 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:16:13 compute-0 nova_compute[117514]: 2025-10-08 19:16:13.274 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 19:16:13 compute-0 nova_compute[117514]: 2025-10-08 19:16:13.718 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:16:13 compute-0 nova_compute[117514]: 2025-10-08 19:16:13.718 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:16:14 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:14.651 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:75:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '5e:14:dd:63:55:2a'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:16:14 compute-0 nova_compute[117514]: 2025-10-08 19:16:14.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:16:14 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:14.654 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 19:16:14 compute-0 nova_compute[117514]: 2025-10-08 19:16:14.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:16:14 compute-0 nova_compute[117514]: 2025-10-08 19:16:14.717 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  8 19:16:15 compute-0 nova_compute[117514]: 2025-10-08 19:16:15.732 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:16:15 compute-0 nova_compute[117514]: 2025-10-08 19:16:15.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:16:16 compute-0 nova_compute[117514]: 2025-10-08 19:16:16.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:16:16 compute-0 nova_compute[117514]: 2025-10-08 19:16:16.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:16:16 compute-0 nova_compute[117514]: 2025-10-08 19:16:16.718 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:16:16 compute-0 nova_compute[117514]: 2025-10-08 19:16:16.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:16:16 compute-0 nova_compute[117514]: 2025-10-08 19:16:16.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:16:18 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:18.655 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f81f7a-64d8-418a-a74c-b879bd6deb83, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:16:19 compute-0 nova_compute[117514]: 2025-10-08 19:16:19.729 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:16:19 compute-0 nova_compute[117514]: 2025-10-08 19:16:19.729 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  8 19:16:19 compute-0 nova_compute[117514]: 2025-10-08 19:16:19.747 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  8 19:16:20 compute-0 nova_compute[117514]: 2025-10-08 19:16:20.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:16:21 compute-0 nova_compute[117514]: 2025-10-08 19:16:21.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:16:23 compute-0 podman[151268]: 2025-10-08 19:16:23.665134398 +0000 UTC m=+0.083793687 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 19:16:25 compute-0 nova_compute[117514]: 2025-10-08 19:16:25.803 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759950970.8029397, 34ca788d-2398-4a40-9f96-040c0849b18f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  8 19:16:25 compute-0 nova_compute[117514]: 2025-10-08 19:16:25.804 2 INFO nova.compute.manager [-] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] VM Stopped (Lifecycle Event)#033[00m
Oct  8 19:16:25 compute-0 nova_compute[117514]: 2025-10-08 19:16:25.834 2 DEBUG nova.compute.manager [None req-a9321ac2-9f92-477a-907c-9a01fc35389e - - - - - -] [instance: 34ca788d-2398-4a40-9f96-040c0849b18f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  8 19:16:25 compute-0 nova_compute[117514]: 2025-10-08 19:16:25.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:16:26 compute-0 nova_compute[117514]: 2025-10-08 19:16:26.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:16:30 compute-0 nova_compute[117514]: 2025-10-08 19:16:30.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:16:31 compute-0 nova_compute[117514]: 2025-10-08 19:16:31.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:16:33 compute-0 podman[151292]: 2025-10-08 19:16:33.677362214 +0000 UTC m=+0.094950860 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 19:16:35 compute-0 nova_compute[117514]: 2025-10-08 19:16:35.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:16:36 compute-0 nova_compute[117514]: 2025-10-08 19:16:36.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:16:39 compute-0 podman[151314]: 2025-10-08 19:16:39.692885232 +0000 UTC m=+0.091094958 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 19:16:39 compute-0 podman[151313]: 2025-10-08 19:16:39.69318219 +0000 UTC m=+0.097769080 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 19:16:39 compute-0 podman[151312]: 2025-10-08 19:16:39.704019933 +0000 UTC m=+0.112703091 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, distribution-scope=public, version=9.6, architecture=x86_64, config_id=edpm, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41)
Oct  8 19:16:40 compute-0 nova_compute[117514]: 2025-10-08 19:16:40.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:16:41 compute-0 nova_compute[117514]: 2025-10-08 19:16:41.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:16:41 compute-0 podman[151375]: 2025-10-08 19:16:41.665412406 +0000 UTC m=+0.072727328 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3)
Oct  8 19:16:41 compute-0 podman[151373]: 2025-10-08 19:16:41.685805344 +0000 UTC m=+0.093350193 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true)
Oct  8 19:16:41 compute-0 podman[151374]: 2025-10-08 19:16:41.718149336 +0000 UTC m=+0.120886366 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  8 19:16:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:44.235 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:16:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:44.236 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:16:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:16:44.236 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:16:45 compute-0 nova_compute[117514]: 2025-10-08 19:16:45.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:16:46 compute-0 nova_compute[117514]: 2025-10-08 19:16:46.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:16:47 compute-0 ovn_controller[19759]: 2025-10-08T19:16:47Z|00167|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Oct  8 19:16:50 compute-0 nova_compute[117514]: 2025-10-08 19:16:50.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:16:51 compute-0 nova_compute[117514]: 2025-10-08 19:16:51.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:16:54 compute-0 podman[151432]: 2025-10-08 19:16:54.68241522 +0000 UTC m=+0.099947464 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 19:16:55 compute-0 nova_compute[117514]: 2025-10-08 19:16:55.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:16:56 compute-0 nova_compute[117514]: 2025-10-08 19:16:56.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:17:00 compute-0 nova_compute[117514]: 2025-10-08 19:17:00.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:17:01 compute-0 nova_compute[117514]: 2025-10-08 19:17:01.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:17:03 compute-0 systemd[1]: Created slice User Slice of UID 1000.
Oct  8 19:17:03 compute-0 systemd[1]: Starting User Runtime Directory /run/user/1000...
Oct  8 19:17:03 compute-0 systemd-logind[844]: New session 12 of user zuul.
Oct  8 19:17:03 compute-0 systemd[1]: Finished User Runtime Directory /run/user/1000.
Oct  8 19:17:03 compute-0 systemd[1]: Starting User Manager for UID 1000...
Oct  8 19:17:03 compute-0 podman[151459]: 2025-10-08 19:17:03.865987464 +0000 UTC m=+0.118899790 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 19:17:04 compute-0 systemd[151478]: Queued start job for default target Main User Target.
Oct  8 19:17:04 compute-0 systemd[151478]: Created slice User Application Slice.
Oct  8 19:17:04 compute-0 systemd[151478]: Started Mark boot as successful after the user session has run 2 minutes.
Oct  8 19:17:04 compute-0 systemd[151478]: Started Daily Cleanup of User's Temporary Directories.
Oct  8 19:17:04 compute-0 systemd[151478]: Reached target Paths.
Oct  8 19:17:04 compute-0 systemd[151478]: Reached target Timers.
Oct  8 19:17:04 compute-0 systemd[151478]: Starting D-Bus User Message Bus Socket...
Oct  8 19:17:04 compute-0 systemd[151478]: Starting Create User's Volatile Files and Directories...
Oct  8 19:17:04 compute-0 systemd[151478]: Listening on D-Bus User Message Bus Socket.
Oct  8 19:17:04 compute-0 systemd[151478]: Reached target Sockets.
Oct  8 19:17:04 compute-0 systemd[151478]: Finished Create User's Volatile Files and Directories.
Oct  8 19:17:04 compute-0 systemd[151478]: Reached target Basic System.
Oct  8 19:17:04 compute-0 systemd[151478]: Reached target Main User Target.
Oct  8 19:17:04 compute-0 systemd[151478]: Startup finished in 182ms.
Oct  8 19:17:04 compute-0 systemd[1]: Started User Manager for UID 1000.
Oct  8 19:17:04 compute-0 systemd[1]: Started Session 12 of User zuul.
Oct  8 19:17:05 compute-0 nova_compute[117514]: 2025-10-08 19:17:05.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:17:06 compute-0 nova_compute[117514]: 2025-10-08 19:17:06.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:17:09 compute-0 ovs-vsctl[151670]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Oct  8 19:17:09 compute-0 nova_compute[117514]: 2025-10-08 19:17:09.735 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:17:09 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 151521 (sos)
Oct  8 19:17:09 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Oct  8 19:17:09 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Oct  8 19:17:10 compute-0 podman[151718]: 2025-10-08 19:17:10.075299672 +0000 UTC m=+0.091547157 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd)
Oct  8 19:17:10 compute-0 podman[151720]: 2025-10-08 19:17:10.075650702 +0000 UTC m=+0.097965662 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  8 19:17:10 compute-0 podman[151717]: 2025-10-08 19:17:10.081982224 +0000 UTC m=+0.104139239 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.openshift.expose-services=, distribution-scope=public, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.buildah.version=1.33.7, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Oct  8 19:17:10 compute-0 virtqemud[117415]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Oct  8 19:17:10 compute-0 virtqemud[117415]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Oct  8 19:17:10 compute-0 virtqemud[117415]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct  8 19:17:10 compute-0 nova_compute[117514]: 2025-10-08 19:17:10.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:17:11 compute-0 kernel: block sr0: the capability attribute has been deprecated.
Oct  8 19:17:11 compute-0 nova_compute[117514]: 2025-10-08 19:17:11.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:17:12 compute-0 podman[152210]: 2025-10-08 19:17:12.66075791 +0000 UTC m=+0.067910185 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  8 19:17:12 compute-0 podman[152206]: 2025-10-08 19:17:12.706280781 +0000 UTC m=+0.116204526 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 19:17:12 compute-0 nova_compute[117514]: 2025-10-08 19:17:12.713 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:17:12 compute-0 nova_compute[117514]: 2025-10-08 19:17:12.715 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:17:12 compute-0 nova_compute[117514]: 2025-10-08 19:17:12.716 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 19:17:12 compute-0 nova_compute[117514]: 2025-10-08 19:17:12.716 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 19:17:12 compute-0 podman[152209]: 2025-10-08 19:17:12.725049231 +0000 UTC m=+0.134495962 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 19:17:12 compute-0 nova_compute[117514]: 2025-10-08 19:17:12.945 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 19:17:12 compute-0 nova_compute[117514]: 2025-10-08 19:17:12.945 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:17:12 compute-0 nova_compute[117514]: 2025-10-08 19:17:12.945 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 19:17:13 compute-0 nova_compute[117514]: 2025-10-08 19:17:13.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:17:13 compute-0 nova_compute[117514]: 2025-10-08 19:17:13.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:17:13 compute-0 nova_compute[117514]: 2025-10-08 19:17:13.745 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:17:13 compute-0 nova_compute[117514]: 2025-10-08 19:17:13.746 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:17:13 compute-0 nova_compute[117514]: 2025-10-08 19:17:13.746 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:17:13 compute-0 nova_compute[117514]: 2025-10-08 19:17:13.746 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 19:17:13 compute-0 nova_compute[117514]: 2025-10-08 19:17:13.879 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 19:17:13 compute-0 nova_compute[117514]: 2025-10-08 19:17:13.880 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5839MB free_disk=73.2657241821289GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 19:17:13 compute-0 nova_compute[117514]: 2025-10-08 19:17:13.880 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:17:13 compute-0 nova_compute[117514]: 2025-10-08 19:17:13.880 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:17:13 compute-0 nova_compute[117514]: 2025-10-08 19:17:13.951 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 19:17:13 compute-0 nova_compute[117514]: 2025-10-08 19:17:13.952 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 19:17:13 compute-0 nova_compute[117514]: 2025-10-08 19:17:13.973 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:17:13 compute-0 nova_compute[117514]: 2025-10-08 19:17:13.987 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:17:13 compute-0 nova_compute[117514]: 2025-10-08 19:17:13.989 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 19:17:13 compute-0 nova_compute[117514]: 2025-10-08 19:17:13.989 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:17:14 compute-0 systemd[1]: Starting Hostname Service...
Oct  8 19:17:14 compute-0 systemd[1]: Started Hostname Service.
Oct  8 19:17:15 compute-0 nova_compute[117514]: 2025-10-08 19:17:15.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:17:15 compute-0 nova_compute[117514]: 2025-10-08 19:17:15.989 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:17:16 compute-0 nova_compute[117514]: 2025-10-08 19:17:16.593 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:17:17 compute-0 nova_compute[117514]: 2025-10-08 19:17:17.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:17:17 compute-0 nova_compute[117514]: 2025-10-08 19:17:17.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:17:18 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Oct  8 19:17:18 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Oct  8 19:17:18 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Oct  8 19:17:18 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Oct  8 19:17:18 compute-0 kernel: cfg80211: failed to load regulatory.db
Oct  8 19:17:18 compute-0 nova_compute[117514]: 2025-10-08 19:17:18.713 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:17:20 compute-0 ovs-appctl[153276]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct  8 19:17:20 compute-0 ovs-appctl[153290]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct  8 19:17:20 compute-0 ovs-appctl[153295]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct  8 19:17:20 compute-0 nova_compute[117514]: 2025-10-08 19:17:20.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:17:21 compute-0 nova_compute[117514]: 2025-10-08 19:17:21.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:17:24 compute-0 podman[154284]: 2025-10-08 19:17:24.892200246 +0000 UTC m=+0.068270127 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  8 19:17:25 compute-0 nova_compute[117514]: 2025-10-08 19:17:25.972 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:17:26 compute-0 nova_compute[117514]: 2025-10-08 19:17:26.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:17:28 compute-0 virtqemud[117415]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct  8 19:17:30 compute-0 nova_compute[117514]: 2025-10-08 19:17:30.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:17:31 compute-0 systemd[1]: Starting Time & Date Service...
Oct  8 19:17:31 compute-0 systemd[1]: Started Time & Date Service.
Oct  8 19:17:31 compute-0 nova_compute[117514]: 2025-10-08 19:17:31.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:17:34 compute-0 podman[154842]: 2025-10-08 19:17:34.518764273 +0000 UTC m=+0.093663537 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=edpm, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  8 19:17:35 compute-0 nova_compute[117514]: 2025-10-08 19:17:35.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:17:36 compute-0 nova_compute[117514]: 2025-10-08 19:17:36.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:17:40 compute-0 podman[154866]: 2025-10-08 19:17:40.705336351 +0000 UTC m=+0.092544885 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 19:17:40 compute-0 podman[154865]: 2025-10-08 19:17:40.715901646 +0000 UTC m=+0.113561351 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct  8 19:17:40 compute-0 podman[154864]: 2025-10-08 19:17:40.740295138 +0000 UTC m=+0.140784754 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350, io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container)
Oct  8 19:17:40 compute-0 nova_compute[117514]: 2025-10-08 19:17:40.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:17:41 compute-0 nova_compute[117514]: 2025-10-08 19:17:41.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:17:43 compute-0 podman[154927]: 2025-10-08 19:17:43.721645274 +0000 UTC m=+0.135514503 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 19:17:43 compute-0 podman[154929]: 2025-10-08 19:17:43.731336553 +0000 UTC m=+0.132879897 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Oct  8 19:17:43 compute-0 podman[154928]: 2025-10-08 19:17:43.749047443 +0000 UTC m=+0.159050911 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Oct  8 19:17:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:17:44.236 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:17:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:17:44.237 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:17:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:17:44.237 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:17:45 compute-0 nova_compute[117514]: 2025-10-08 19:17:45.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:17:46 compute-0 nova_compute[117514]: 2025-10-08 19:17:46.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:17:48 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Oct  8 19:17:48 compute-0 systemd[1]: session-12.scope: Consumed 1min 16.025s CPU time, 580.3M memory peak, read 172.9M from disk, written 17.6M to disk.
Oct  8 19:17:48 compute-0 systemd-logind[844]: Session 12 logged out. Waiting for processes to exit.
Oct  8 19:17:48 compute-0 systemd-logind[844]: Removed session 12.
Oct  8 19:17:48 compute-0 systemd-logind[844]: New session 14 of user zuul.
Oct  8 19:17:48 compute-0 systemd[1]: Started Session 14 of User zuul.
Oct  8 19:17:49 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Oct  8 19:17:49 compute-0 systemd-logind[844]: Session 14 logged out. Waiting for processes to exit.
Oct  8 19:17:49 compute-0 systemd-logind[844]: Removed session 14.
Oct  8 19:17:49 compute-0 systemd-logind[844]: New session 15 of user zuul.
Oct  8 19:17:49 compute-0 systemd[1]: Started Session 15 of User zuul.
Oct  8 19:17:49 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Oct  8 19:17:49 compute-0 systemd-logind[844]: Session 15 logged out. Waiting for processes to exit.
Oct  8 19:17:49 compute-0 systemd-logind[844]: Removed session 15.
Oct  8 19:17:50 compute-0 nova_compute[117514]: 2025-10-08 19:17:50.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:17:51 compute-0 nova_compute[117514]: 2025-10-08 19:17:51.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:17:55 compute-0 podman[155051]: 2025-10-08 19:17:55.652448333 +0000 UTC m=+0.070412958 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 19:17:56 compute-0 nova_compute[117514]: 2025-10-08 19:17:56.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:17:56 compute-0 nova_compute[117514]: 2025-10-08 19:17:56.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:17:59 compute-0 systemd[1]: Stopping User Manager for UID 1000...
Oct  8 19:17:59 compute-0 systemd[151478]: Activating special unit Exit the Session...
Oct  8 19:17:59 compute-0 systemd[151478]: Stopped target Main User Target.
Oct  8 19:17:59 compute-0 systemd[151478]: Stopped target Basic System.
Oct  8 19:17:59 compute-0 systemd[151478]: Stopped target Paths.
Oct  8 19:17:59 compute-0 systemd[151478]: Stopped target Sockets.
Oct  8 19:17:59 compute-0 systemd[151478]: Stopped target Timers.
Oct  8 19:17:59 compute-0 systemd[151478]: Stopped Mark boot as successful after the user session has run 2 minutes.
Oct  8 19:17:59 compute-0 systemd[151478]: Stopped Daily Cleanup of User's Temporary Directories.
Oct  8 19:17:59 compute-0 systemd[151478]: Closed D-Bus User Message Bus Socket.
Oct  8 19:17:59 compute-0 systemd[151478]: Stopped Create User's Volatile Files and Directories.
Oct  8 19:17:59 compute-0 systemd[151478]: Removed slice User Application Slice.
Oct  8 19:17:59 compute-0 systemd[151478]: Reached target Shutdown.
Oct  8 19:17:59 compute-0 systemd[151478]: Finished Exit the Session.
Oct  8 19:17:59 compute-0 systemd[151478]: Reached target Exit the Session.
Oct  8 19:17:59 compute-0 systemd[1]: user@1000.service: Deactivated successfully.
Oct  8 19:17:59 compute-0 systemd[1]: Stopped User Manager for UID 1000.
Oct  8 19:17:59 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/1000...
Oct  8 19:17:59 compute-0 systemd[1]: run-user-1000.mount: Deactivated successfully.
Oct  8 19:17:59 compute-0 systemd[1]: user-runtime-dir@1000.service: Deactivated successfully.
Oct  8 19:17:59 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/1000.
Oct  8 19:17:59 compute-0 systemd[1]: Removed slice User Slice of UID 1000.
Oct  8 19:17:59 compute-0 systemd[1]: user-1000.slice: Consumed 1min 16.666s CPU time, 585.7M memory peak, read 172.9M from disk, written 17.6M to disk.
Oct  8 19:18:01 compute-0 nova_compute[117514]: 2025-10-08 19:18:01.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:18:01 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct  8 19:18:01 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct  8 19:18:01 compute-0 nova_compute[117514]: 2025-10-08 19:18:01.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:18:04 compute-0 podman[155081]: 2025-10-08 19:18:04.685246087 +0000 UTC m=+0.106192319 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  8 19:18:06 compute-0 nova_compute[117514]: 2025-10-08 19:18:06.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:18:06 compute-0 nova_compute[117514]: 2025-10-08 19:18:06.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.247 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.247 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.247 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.247 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.249 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.249 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.249 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.249 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.250 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.250 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.250 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.250 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.250 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.251 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.251 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.251 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.251 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.251 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:18:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:18:08.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:18:10 compute-0 nova_compute[117514]: 2025-10-08 19:18:10.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 19:18:11 compute-0 nova_compute[117514]: 2025-10-08 19:18:11.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:18:11 compute-0 podman[155101]: 2025-10-08 19:18:11.645643603 +0000 UTC m=+0.063506999 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.buildah.version=1.33.7, version=9.6, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., architecture=x86_64, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  8 19:18:11 compute-0 nova_compute[117514]: 2025-10-08 19:18:11.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:18:11 compute-0 podman[155102]: 2025-10-08 19:18:11.680011263 +0000 UTC m=+0.094111961 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0)
Oct  8 19:18:11 compute-0 podman[155103]: 2025-10-08 19:18:11.685434439 +0000 UTC m=+0.092039001 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 19:18:12 compute-0 nova_compute[117514]: 2025-10-08 19:18:12.713 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 19:18:12 compute-0 nova_compute[117514]: 2025-10-08 19:18:12.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 19:18:12 compute-0 nova_compute[117514]: 2025-10-08 19:18:12.717 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct  8 19:18:12 compute-0 nova_compute[117514]: 2025-10-08 19:18:12.717 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct  8 19:18:12 compute-0 nova_compute[117514]: 2025-10-08 19:18:12.743 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct  8 19:18:12 compute-0 nova_compute[117514]: 2025-10-08 19:18:12.743 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 19:18:12 compute-0 nova_compute[117514]: 2025-10-08 19:18:12.744 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct  8 19:18:14 compute-0 podman[155170]: 2025-10-08 19:18:14.677103462 +0000 UTC m=+0.077699168 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  8 19:18:14 compute-0 podman[155168]: 2025-10-08 19:18:14.677972217 +0000 UTC m=+0.090203428 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid)
Oct  8 19:18:14 compute-0 nova_compute[117514]: 2025-10-08 19:18:14.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 19:18:14 compute-0 podman[155169]: 2025-10-08 19:18:14.716971179 +0000 UTC m=+0.124649169 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 19:18:14 compute-0 nova_compute[117514]: 2025-10-08 19:18:14.748 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 19:18:14 compute-0 nova_compute[117514]: 2025-10-08 19:18:14.748 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 19:18:14 compute-0 nova_compute[117514]: 2025-10-08 19:18:14.749 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 19:18:14 compute-0 nova_compute[117514]: 2025-10-08 19:18:14.749 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct  8 19:18:14 compute-0 nova_compute[117514]: 2025-10-08 19:18:14.967 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  8 19:18:14 compute-0 nova_compute[117514]: 2025-10-08 19:18:14.968 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5988MB free_disk=73.40881729125977GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct  8 19:18:14 compute-0 nova_compute[117514]: 2025-10-08 19:18:14.969 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  8 19:18:14 compute-0 nova_compute[117514]: 2025-10-08 19:18:14.969 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 19:18:15 compute-0 nova_compute[117514]: 2025-10-08 19:18:15.060 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct  8 19:18:15 compute-0 nova_compute[117514]: 2025-10-08 19:18:15.061 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct  8 19:18:15 compute-0 nova_compute[117514]: 2025-10-08 19:18:15.082 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  8 19:18:15 compute-0 nova_compute[117514]: 2025-10-08 19:18:15.096 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  8 19:18:15 compute-0 nova_compute[117514]: 2025-10-08 19:18:15.098 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct  8 19:18:15 compute-0 nova_compute[117514]: 2025-10-08 19:18:15.098 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 19:18:16 compute-0 nova_compute[117514]: 2025-10-08 19:18:16.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:18:16 compute-0 nova_compute[117514]: 2025-10-08 19:18:16.099 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 19:18:16 compute-0 nova_compute[117514]: 2025-10-08 19:18:16.100 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 19:18:16 compute-0 nova_compute[117514]: 2025-10-08 19:18:16.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:18:17 compute-0 nova_compute[117514]: 2025-10-08 19:18:17.718 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 19:18:18 compute-0 nova_compute[117514]: 2025-10-08 19:18:18.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 19:18:21 compute-0 nova_compute[117514]: 2025-10-08 19:18:21.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:18:21 compute-0 nova_compute[117514]: 2025-10-08 19:18:21.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:18:26 compute-0 nova_compute[117514]: 2025-10-08 19:18:26.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:18:26 compute-0 podman[155233]: 2025-10-08 19:18:26.658196619 +0000 UTC m=+0.073437796 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  8 19:18:26 compute-0 nova_compute[117514]: 2025-10-08 19:18:26.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:18:31 compute-0 nova_compute[117514]: 2025-10-08 19:18:31.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:18:31 compute-0 nova_compute[117514]: 2025-10-08 19:18:31.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:18:35 compute-0 podman[155262]: 2025-10-08 19:18:35.673259601 +0000 UTC m=+0.086478501 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 19:18:36 compute-0 nova_compute[117514]: 2025-10-08 19:18:36.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:18:36 compute-0 nova_compute[117514]: 2025-10-08 19:18:36.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:18:41 compute-0 nova_compute[117514]: 2025-10-08 19:18:41.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:18:41 compute-0 nova_compute[117514]: 2025-10-08 19:18:41.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:18:42 compute-0 podman[155290]: 2025-10-08 19:18:42.64138494 +0000 UTC m=+0.050302749 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 19:18:42 compute-0 podman[155288]: 2025-10-08 19:18:42.64556646 +0000 UTC m=+0.061250614 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.expose-services=, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-type=git, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  8 19:18:42 compute-0 podman[155289]: 2025-10-08 19:18:42.655316111 +0000 UTC m=+0.067691690 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  8 19:18:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:18:44.238 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:18:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:18:44.238 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:18:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:18:44.238 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:18:45 compute-0 podman[155357]: 2025-10-08 19:18:45.658186288 +0000 UTC m=+0.073183388 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 19:18:45 compute-0 podman[155359]: 2025-10-08 19:18:45.715910569 +0000 UTC m=+0.114983441 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct  8 19:18:45 compute-0 podman[155358]: 2025-10-08 19:18:45.740849286 +0000 UTC m=+0.145863239 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  8 19:18:46 compute-0 nova_compute[117514]: 2025-10-08 19:18:46.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:18:46 compute-0 nova_compute[117514]: 2025-10-08 19:18:46.681 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:18:51 compute-0 nova_compute[117514]: 2025-10-08 19:18:51.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:18:51 compute-0 nova_compute[117514]: 2025-10-08 19:18:51.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:18:56 compute-0 nova_compute[117514]: 2025-10-08 19:18:56.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:18:56 compute-0 nova_compute[117514]: 2025-10-08 19:18:56.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:18:57 compute-0 podman[155422]: 2025-10-08 19:18:57.637736581 +0000 UTC m=+0.062801939 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 19:19:01 compute-0 nova_compute[117514]: 2025-10-08 19:19:01.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:19:01 compute-0 nova_compute[117514]: 2025-10-08 19:19:01.688 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:19:06 compute-0 nova_compute[117514]: 2025-10-08 19:19:06.054 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:19:06 compute-0 nova_compute[117514]: 2025-10-08 19:19:06.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:19:06 compute-0 podman[155446]: 2025-10-08 19:19:06.736614327 +0000 UTC m=+0.150165895 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  8 19:19:11 compute-0 nova_compute[117514]: 2025-10-08 19:19:11.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:19:11 compute-0 nova_compute[117514]: 2025-10-08 19:19:11.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:19:12 compute-0 nova_compute[117514]: 2025-10-08 19:19:12.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:19:13 compute-0 podman[155468]: 2025-10-08 19:19:13.692589966 +0000 UTC m=+0.094170752 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  8 19:19:13 compute-0 podman[155467]: 2025-10-08 19:19:13.697329243 +0000 UTC m=+0.102685098 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 19:19:13 compute-0 nova_compute[117514]: 2025-10-08 19:19:13.713 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:19:13 compute-0 podman[155466]: 2025-10-08 19:19:13.715218588 +0000 UTC m=+0.128801050 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, release=1755695350, vcs-type=git, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  8 19:19:13 compute-0 nova_compute[117514]: 2025-10-08 19:19:13.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:19:13 compute-0 nova_compute[117514]: 2025-10-08 19:19:13.717 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 19:19:14 compute-0 nova_compute[117514]: 2025-10-08 19:19:14.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:19:14 compute-0 nova_compute[117514]: 2025-10-08 19:19:14.718 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 19:19:14 compute-0 nova_compute[117514]: 2025-10-08 19:19:14.718 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 19:19:14 compute-0 nova_compute[117514]: 2025-10-08 19:19:14.733 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 19:19:14 compute-0 nova_compute[117514]: 2025-10-08 19:19:14.734 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:19:14 compute-0 nova_compute[117514]: 2025-10-08 19:19:14.759 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:19:14 compute-0 nova_compute[117514]: 2025-10-08 19:19:14.760 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:19:14 compute-0 nova_compute[117514]: 2025-10-08 19:19:14.760 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:19:14 compute-0 nova_compute[117514]: 2025-10-08 19:19:14.760 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 19:19:14 compute-0 nova_compute[117514]: 2025-10-08 19:19:14.988 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 19:19:14 compute-0 nova_compute[117514]: 2025-10-08 19:19:14.989 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6052MB free_disk=73.40883255004883GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 19:19:14 compute-0 nova_compute[117514]: 2025-10-08 19:19:14.989 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:19:14 compute-0 nova_compute[117514]: 2025-10-08 19:19:14.990 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:19:15 compute-0 nova_compute[117514]: 2025-10-08 19:19:15.045 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 19:19:15 compute-0 nova_compute[117514]: 2025-10-08 19:19:15.046 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 19:19:15 compute-0 nova_compute[117514]: 2025-10-08 19:19:15.067 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:19:15 compute-0 nova_compute[117514]: 2025-10-08 19:19:15.080 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:19:15 compute-0 nova_compute[117514]: 2025-10-08 19:19:15.082 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 19:19:15 compute-0 nova_compute[117514]: 2025-10-08 19:19:15.082 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:19:16 compute-0 nova_compute[117514]: 2025-10-08 19:19:16.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:19:16 compute-0 podman[155532]: 2025-10-08 19:19:16.671760019 +0000 UTC m=+0.087052727 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, org.label-schema.build-date=20251001, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible)
Oct  8 19:19:16 compute-0 podman[155534]: 2025-10-08 19:19:16.672336156 +0000 UTC m=+0.075552087 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  8 19:19:16 compute-0 podman[155533]: 2025-10-08 19:19:16.711575715 +0000 UTC m=+0.118503072 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Oct  8 19:19:16 compute-0 nova_compute[117514]: 2025-10-08 19:19:16.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:19:17 compute-0 nova_compute[117514]: 2025-10-08 19:19:17.065 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:19:17 compute-0 nova_compute[117514]: 2025-10-08 19:19:17.066 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:19:17 compute-0 nova_compute[117514]: 2025-10-08 19:19:17.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:19:18 compute-0 nova_compute[117514]: 2025-10-08 19:19:18.713 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:19:18 compute-0 nova_compute[117514]: 2025-10-08 19:19:18.740 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:19:21 compute-0 nova_compute[117514]: 2025-10-08 19:19:21.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:19:21 compute-0 nova_compute[117514]: 2025-10-08 19:19:21.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:19:26 compute-0 nova_compute[117514]: 2025-10-08 19:19:26.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:19:26 compute-0 nova_compute[117514]: 2025-10-08 19:19:26.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:19:28 compute-0 podman[155594]: 2025-10-08 19:19:28.656991313 +0000 UTC m=+0.070862620 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  8 19:19:31 compute-0 nova_compute[117514]: 2025-10-08 19:19:31.075 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:19:31 compute-0 nova_compute[117514]: 2025-10-08 19:19:31.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:19:36 compute-0 nova_compute[117514]: 2025-10-08 19:19:36.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:19:36 compute-0 nova_compute[117514]: 2025-10-08 19:19:36.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:19:37 compute-0 podman[155619]: 2025-10-08 19:19:37.678113388 +0000 UTC m=+0.095794478 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 19:19:41 compute-0 nova_compute[117514]: 2025-10-08 19:19:41.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:19:41 compute-0 nova_compute[117514]: 2025-10-08 19:19:41.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:19:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:19:44.239 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:19:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:19:44.239 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:19:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:19:44.240 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:19:44 compute-0 podman[155640]: 2025-10-08 19:19:44.6538235 +0000 UTC m=+0.075318728 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Oct  8 19:19:44 compute-0 podman[155639]: 2025-10-08 19:19:44.654285553 +0000 UTC m=+0.079054845 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  8 19:19:44 compute-0 podman[155641]: 2025-10-08 19:19:44.675430842 +0000 UTC m=+0.087954002 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 19:19:46 compute-0 nova_compute[117514]: 2025-10-08 19:19:46.109 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:19:46 compute-0 nova_compute[117514]: 2025-10-08 19:19:46.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:19:47 compute-0 podman[155700]: 2025-10-08 19:19:47.66498048 +0000 UTC m=+0.073652630 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid)
Oct  8 19:19:47 compute-0 podman[155702]: 2025-10-08 19:19:47.678028166 +0000 UTC m=+0.072926790 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Oct  8 19:19:47 compute-0 podman[155701]: 2025-10-08 19:19:47.78417096 +0000 UTC m=+0.189536545 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 19:19:51 compute-0 nova_compute[117514]: 2025-10-08 19:19:51.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:19:51 compute-0 nova_compute[117514]: 2025-10-08 19:19:51.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:19:56 compute-0 nova_compute[117514]: 2025-10-08 19:19:56.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:19:56 compute-0 nova_compute[117514]: 2025-10-08 19:19:56.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:19:59 compute-0 podman[155763]: 2025-10-08 19:19:59.657328366 +0000 UTC m=+0.064477526 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 19:20:01 compute-0 nova_compute[117514]: 2025-10-08 19:20:01.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:20:01 compute-0 nova_compute[117514]: 2025-10-08 19:20:01.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:20:06 compute-0 nova_compute[117514]: 2025-10-08 19:20:06.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:20:06 compute-0 nova_compute[117514]: 2025-10-08 19:20:06.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:20:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:20:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:20:08 compute-0 systemd[1]: Starting system activity accounting tool...
Oct  8 19:20:08 compute-0 systemd[1]: sysstat-collect.service: Deactivated successfully.
Oct  8 19:20:08 compute-0 systemd[1]: Finished system activity accounting tool.
Oct  8 19:20:08 compute-0 podman[155787]: 2025-10-08 19:20:08.65549773 +0000 UTC m=+0.077291965 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Oct  8 19:20:11 compute-0 nova_compute[117514]: 2025-10-08 19:20:11.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:20:11 compute-0 nova_compute[117514]: 2025-10-08 19:20:11.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:20:12 compute-0 nova_compute[117514]: 2025-10-08 19:20:12.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:20:13 compute-0 nova_compute[117514]: 2025-10-08 19:20:13.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:20:13 compute-0 nova_compute[117514]: 2025-10-08 19:20:13.718 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 19:20:14 compute-0 nova_compute[117514]: 2025-10-08 19:20:14.712 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:20:15 compute-0 podman[155809]: 2025-10-08 19:20:15.677659099 +0000 UTC m=+0.084523853 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  8 19:20:15 compute-0 podman[155808]: 2025-10-08 19:20:15.677789963 +0000 UTC m=+0.088761805 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, version=9.6, config_id=edpm, distribution-scope=public)
Oct  8 19:20:15 compute-0 podman[155810]: 2025-10-08 19:20:15.678765891 +0000 UTC m=+0.080258121 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 19:20:15 compute-0 nova_compute[117514]: 2025-10-08 19:20:15.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:20:15 compute-0 nova_compute[117514]: 2025-10-08 19:20:15.745 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:20:15 compute-0 nova_compute[117514]: 2025-10-08 19:20:15.745 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:20:15 compute-0 nova_compute[117514]: 2025-10-08 19:20:15.745 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:20:15 compute-0 nova_compute[117514]: 2025-10-08 19:20:15.746 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 19:20:16 compute-0 nova_compute[117514]: 2025-10-08 19:20:16.000 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 19:20:16 compute-0 nova_compute[117514]: 2025-10-08 19:20:16.001 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6071MB free_disk=73.40880966186523GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 19:20:16 compute-0 nova_compute[117514]: 2025-10-08 19:20:16.002 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:20:16 compute-0 nova_compute[117514]: 2025-10-08 19:20:16.002 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:20:16 compute-0 nova_compute[117514]: 2025-10-08 19:20:16.066 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 19:20:16 compute-0 nova_compute[117514]: 2025-10-08 19:20:16.067 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 19:20:16 compute-0 nova_compute[117514]: 2025-10-08 19:20:16.088 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:20:16 compute-0 nova_compute[117514]: 2025-10-08 19:20:16.105 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:20:16 compute-0 nova_compute[117514]: 2025-10-08 19:20:16.107 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 19:20:16 compute-0 nova_compute[117514]: 2025-10-08 19:20:16.108 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:20:16 compute-0 nova_compute[117514]: 2025-10-08 19:20:16.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:20:16 compute-0 nova_compute[117514]: 2025-10-08 19:20:16.815 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:20:17 compute-0 nova_compute[117514]: 2025-10-08 19:20:17.108 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:20:17 compute-0 nova_compute[117514]: 2025-10-08 19:20:17.109 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 19:20:17 compute-0 nova_compute[117514]: 2025-10-08 19:20:17.109 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 19:20:17 compute-0 nova_compute[117514]: 2025-10-08 19:20:17.122 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 19:20:17 compute-0 nova_compute[117514]: 2025-10-08 19:20:17.123 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:20:17 compute-0 nova_compute[117514]: 2025-10-08 19:20:17.124 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:20:18 compute-0 podman[155870]: 2025-10-08 19:20:18.674842418 +0000 UTC m=+0.089259969 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct  8 19:20:18 compute-0 podman[155872]: 2025-10-08 19:20:18.676254539 +0000 UTC m=+0.079884560 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent)
Oct  8 19:20:18 compute-0 nova_compute[117514]: 2025-10-08 19:20:18.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:20:18 compute-0 podman[155871]: 2025-10-08 19:20:18.721346326 +0000 UTC m=+0.130915708 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team)
Oct  8 19:20:19 compute-0 nova_compute[117514]: 2025-10-08 19:20:19.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:20:21 compute-0 nova_compute[117514]: 2025-10-08 19:20:21.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:20:21 compute-0 nova_compute[117514]: 2025-10-08 19:20:21.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:20:26 compute-0 nova_compute[117514]: 2025-10-08 19:20:26.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:20:26 compute-0 nova_compute[117514]: 2025-10-08 19:20:26.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:20:30 compute-0 podman[155931]: 2025-10-08 19:20:30.663037704 +0000 UTC m=+0.076615015 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 19:20:31 compute-0 nova_compute[117514]: 2025-10-08 19:20:31.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:20:31 compute-0 nova_compute[117514]: 2025-10-08 19:20:31.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:20:36 compute-0 nova_compute[117514]: 2025-10-08 19:20:36.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:20:36 compute-0 nova_compute[117514]: 2025-10-08 19:20:36.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:20:39 compute-0 podman[155955]: 2025-10-08 19:20:39.67824362 +0000 UTC m=+0.090950738 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, 
container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  8 19:20:41 compute-0 nova_compute[117514]: 2025-10-08 19:20:41.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:20:41 compute-0 nova_compute[117514]: 2025-10-08 19:20:41.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:20:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:20:44.240 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:20:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:20:44.240 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:20:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:20:44.240 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:20:46 compute-0 nova_compute[117514]: 2025-10-08 19:20:46.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:20:46 compute-0 podman[155976]: 2025-10-08 19:20:46.65563922 +0000 UTC m=+0.069838470 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  8 19:20:46 compute-0 podman[155975]: 2025-10-08 19:20:46.664025182 +0000 UTC m=+0.083238946 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=edpm, release=1755695350, container_name=openstack_network_exporter, architecture=x86_64, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct  8 19:20:46 compute-0 podman[155977]: 2025-10-08 19:20:46.667297686 +0000 UTC m=+0.066529005 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 19:20:46 compute-0 nova_compute[117514]: 2025-10-08 19:20:46.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:20:49 compute-0 podman[156038]: 2025-10-08 19:20:49.654893088 +0000 UTC m=+0.079531439 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct  8 19:20:49 compute-0 podman[156040]: 2025-10-08 19:20:49.664800143 +0000 UTC m=+0.074269918 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct  8 19:20:49 compute-0 podman[156039]: 2025-10-08 19:20:49.728466545 +0000 UTC m=+0.138951339 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  8 19:20:51 compute-0 nova_compute[117514]: 2025-10-08 19:20:51.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:20:51 compute-0 nova_compute[117514]: 2025-10-08 19:20:51.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:20:56 compute-0 nova_compute[117514]: 2025-10-08 19:20:56.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:20:56 compute-0 nova_compute[117514]: 2025-10-08 19:20:56.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:21:01 compute-0 nova_compute[117514]: 2025-10-08 19:21:01.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:21:01 compute-0 podman[156100]: 2025-10-08 19:21:01.670172006 +0000 UTC m=+0.090460694 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 19:21:01 compute-0 nova_compute[117514]: 2025-10-08 19:21:01.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:21:01 compute-0 nova_compute[117514]: 2025-10-08 19:21:01.933 2 DEBUG oslo_concurrency.processutils [None req-726d2452-5976-48e6-a261-8201e24bb8bf 4109eb10f1504d00848780f1ed22af42 0776a2a010754884a7b224f3b08ef53b - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  8 19:21:01 compute-0 nova_compute[117514]: 2025-10-08 19:21:01.957 2 DEBUG oslo_concurrency.processutils [None req-726d2452-5976-48e6-a261-8201e24bb8bf 4109eb10f1504d00848780f1ed22af42 0776a2a010754884a7b224f3b08ef53b - - default default] CMD "env LANG=C uptime" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  8 19:21:06 compute-0 nova_compute[117514]: 2025-10-08 19:21:06.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:21:06 compute-0 nova_compute[117514]: 2025-10-08 19:21:06.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:21:07 compute-0 nova_compute[117514]: 2025-10-08 19:21:07.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:21:07 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:21:07.517 28643 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:75:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': '5e:14:dd:63:55:2a'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  8 19:21:07 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:21:07.518 28643 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  8 19:21:10 compute-0 podman[156125]: 2025-10-08 19:21:10.658190987 +0000 UTC m=+0.085357227 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible)
Oct  8 19:21:11 compute-0 nova_compute[117514]: 2025-10-08 19:21:11.330 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:21:11 compute-0 nova_compute[117514]: 2025-10-08 19:21:11.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:21:13 compute-0 nova_compute[117514]: 2025-10-08 19:21:13.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:21:14 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:21:14.522 28643 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=47f81f7a-64d8-418a-a74c-b879bd6deb83, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  8 19:21:14 compute-0 nova_compute[117514]: 2025-10-08 19:21:14.713 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:21:15 compute-0 nova_compute[117514]: 2025-10-08 19:21:15.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:21:15 compute-0 nova_compute[117514]: 2025-10-08 19:21:15.717 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 19:21:15 compute-0 nova_compute[117514]: 2025-10-08 19:21:15.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:21:15 compute-0 nova_compute[117514]: 2025-10-08 19:21:15.756 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:21:15 compute-0 nova_compute[117514]: 2025-10-08 19:21:15.757 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:21:15 compute-0 nova_compute[117514]: 2025-10-08 19:21:15.757 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:21:15 compute-0 nova_compute[117514]: 2025-10-08 19:21:15.758 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 19:21:15 compute-0 nova_compute[117514]: 2025-10-08 19:21:15.994 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 19:21:15 compute-0 nova_compute[117514]: 2025-10-08 19:21:15.995 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6079MB free_disk=73.40882873535156GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 19:21:15 compute-0 nova_compute[117514]: 2025-10-08 19:21:15.995 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:21:15 compute-0 nova_compute[117514]: 2025-10-08 19:21:15.996 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:21:16 compute-0 nova_compute[117514]: 2025-10-08 19:21:16.300 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 19:21:16 compute-0 nova_compute[117514]: 2025-10-08 19:21:16.301 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 19:21:16 compute-0 nova_compute[117514]: 2025-10-08 19:21:16.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:21:16 compute-0 nova_compute[117514]: 2025-10-08 19:21:16.370 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Refreshing inventories for resource provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  8 19:21:16 compute-0 nova_compute[117514]: 2025-10-08 19:21:16.434 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Updating ProviderTree inventory for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  8 19:21:16 compute-0 nova_compute[117514]: 2025-10-08 19:21:16.435 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Updating inventory in ProviderTree for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  8 19:21:16 compute-0 nova_compute[117514]: 2025-10-08 19:21:16.451 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Refreshing aggregate associations for resource provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  8 19:21:16 compute-0 nova_compute[117514]: 2025-10-08 19:21:16.504 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Refreshing trait associations for resource provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_FDC,COMPUTE_DEVICE_TAGGING,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE2,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_MMX,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_ACCELERATORS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_SSE42,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_AVX,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_FMA3,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SVM,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NODE,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX2,HW_CPU_X86_SSE,HW_CPU_X86_CLMUL,HW_CPU_X86_F16C,HW_CPU_X86_AMD_SVM,HW_CPU_X86_ABM,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_BMI,HW_CPU_X86_SSE41,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_PCNET _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  8 19:21:16 compute-0 nova_compute[117514]: 2025-10-08 19:21:16.533 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:21:16 compute-0 nova_compute[117514]: 2025-10-08 19:21:16.567 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:21:16 compute-0 nova_compute[117514]: 2025-10-08 19:21:16.569 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 19:21:16 compute-0 nova_compute[117514]: 2025-10-08 19:21:16.570 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.574s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:21:16 compute-0 nova_compute[117514]: 2025-10-08 19:21:16.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:21:17 compute-0 podman[156147]: 2025-10-08 19:21:17.66417527 +0000 UTC m=+0.087676604 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41)
Oct  8 19:21:17 compute-0 podman[156148]: 2025-10-08 19:21:17.675213357 +0000 UTC m=+0.061566301 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Oct  8 19:21:17 compute-0 podman[156149]: 2025-10-08 19:21:17.707067084 +0000 UTC m=+0.082978618 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 19:21:18 compute-0 nova_compute[117514]: 2025-10-08 19:21:18.570 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:21:18 compute-0 nova_compute[117514]: 2025-10-08 19:21:18.570 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 19:21:18 compute-0 nova_compute[117514]: 2025-10-08 19:21:18.570 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 19:21:18 compute-0 nova_compute[117514]: 2025-10-08 19:21:18.595 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 19:21:18 compute-0 nova_compute[117514]: 2025-10-08 19:21:18.596 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:21:18 compute-0 nova_compute[117514]: 2025-10-08 19:21:18.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:21:18 compute-0 nova_compute[117514]: 2025-10-08 19:21:18.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:21:20 compute-0 podman[156210]: 2025-10-08 19:21:20.682965179 +0000 UTC m=+0.090380641 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.3)
Oct  8 19:21:20 compute-0 podman[156212]: 2025-10-08 19:21:20.686507091 +0000 UTC m=+0.073515956 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 19:21:20 compute-0 podman[156211]: 2025-10-08 19:21:20.710752579 +0000 UTC m=+0.117721339 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 19:21:20 compute-0 nova_compute[117514]: 2025-10-08 19:21:20.712 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:21:20 compute-0 nova_compute[117514]: 2025-10-08 19:21:20.728 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:21:20 compute-0 nova_compute[117514]: 2025-10-08 19:21:20.728 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  8 19:21:20 compute-0 nova_compute[117514]: 2025-10-08 19:21:20.741 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  8 19:21:21 compute-0 nova_compute[117514]: 2025-10-08 19:21:21.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:21:21 compute-0 nova_compute[117514]: 2025-10-08 19:21:21.731 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:21:21 compute-0 nova_compute[117514]: 2025-10-08 19:21:21.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:21:22 compute-0 nova_compute[117514]: 2025-10-08 19:21:22.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:21:26 compute-0 nova_compute[117514]: 2025-10-08 19:21:26.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:21:26 compute-0 nova_compute[117514]: 2025-10-08 19:21:26.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:21:27 compute-0 nova_compute[117514]: 2025-10-08 19:21:27.734 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:21:27 compute-0 nova_compute[117514]: 2025-10-08 19:21:27.735 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  8 19:21:31 compute-0 nova_compute[117514]: 2025-10-08 19:21:31.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:21:31 compute-0 nova_compute[117514]: 2025-10-08 19:21:31.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:21:32 compute-0 podman[156270]: 2025-10-08 19:21:32.6312924 +0000 UTC m=+0.053182341 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 19:21:34 compute-0 nova_compute[117514]: 2025-10-08 19:21:34.131 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:21:36 compute-0 nova_compute[117514]: 2025-10-08 19:21:36.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:21:36 compute-0 nova_compute[117514]: 2025-10-08 19:21:36.881 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:21:41 compute-0 nova_compute[117514]: 2025-10-08 19:21:41.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:21:41 compute-0 podman[156295]: 2025-10-08 19:21:41.666985364 +0000 UTC m=+0.078466479 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, managed_by=edpm_ansible)
Oct  8 19:21:41 compute-0 nova_compute[117514]: 2025-10-08 19:21:41.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:21:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:21:44.242 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:21:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:21:44.242 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:21:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:21:44.243 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:21:46 compute-0 nova_compute[117514]: 2025-10-08 19:21:46.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:21:46 compute-0 nova_compute[117514]: 2025-10-08 19:21:46.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:21:48 compute-0 podman[156317]: 2025-10-08 19:21:48.661300582 +0000 UTC m=+0.078406127 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=multipathd)
Oct  8 19:21:48 compute-0 podman[156318]: 2025-10-08 19:21:48.679601799 +0000 UTC m=+0.082822604 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 19:21:48 compute-0 podman[156316]: 2025-10-08 19:21:48.695839206 +0000 UTC m=+0.114585968 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  8 19:21:51 compute-0 nova_compute[117514]: 2025-10-08 19:21:51.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:21:51 compute-0 podman[156380]: 2025-10-08 19:21:51.688456625 +0000 UTC m=+0.089289261 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  8 19:21:51 compute-0 podman[156378]: 2025-10-08 19:21:51.692160011 +0000 UTC m=+0.112413365 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 19:21:51 compute-0 podman[156379]: 2025-10-08 19:21:51.704074924 +0000 UTC m=+0.118342906 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  8 19:21:51 compute-0 nova_compute[117514]: 2025-10-08 19:21:51.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:21:56 compute-0 nova_compute[117514]: 2025-10-08 19:21:56.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:21:57 compute-0 nova_compute[117514]: 2025-10-08 19:21:57.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:22:01 compute-0 nova_compute[117514]: 2025-10-08 19:22:01.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:22:02 compute-0 nova_compute[117514]: 2025-10-08 19:22:02.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:22:03 compute-0 podman[156440]: 2025-10-08 19:22:03.659518857 +0000 UTC m=+0.079614022 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 19:22:06 compute-0 nova_compute[117514]: 2025-10-08 19:22:06.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:22:07 compute-0 nova_compute[117514]: 2025-10-08 19:22:07.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.247 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.247 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.247 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.249 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.249 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.249 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.249 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.250 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.250 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.250 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.250 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.251 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.251 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.251 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.251 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:22:08 compute-0 ceilometer_agent_compute[128303]: 2025-10-08 19:22:08.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  8 19:22:11 compute-0 nova_compute[117514]: 2025-10-08 19:22:11.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:22:12 compute-0 nova_compute[117514]: 2025-10-08 19:22:12.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:22:12 compute-0 podman[156465]: 2025-10-08 19:22:12.668979658 +0000 UTC m=+0.093848662 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 19:22:14 compute-0 nova_compute[117514]: 2025-10-08 19:22:14.736 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:22:14 compute-0 nova_compute[117514]: 2025-10-08 19:22:14.737 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:22:15 compute-0 nova_compute[117514]: 2025-10-08 19:22:15.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:22:15 compute-0 nova_compute[117514]: 2025-10-08 19:22:15.753 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:22:15 compute-0 nova_compute[117514]: 2025-10-08 19:22:15.754 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:22:15 compute-0 nova_compute[117514]: 2025-10-08 19:22:15.754 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:22:15 compute-0 nova_compute[117514]: 2025-10-08 19:22:15.754 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 19:22:15 compute-0 nova_compute[117514]: 2025-10-08 19:22:15.993 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 19:22:15 compute-0 nova_compute[117514]: 2025-10-08 19:22:15.995 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6069MB free_disk=73.40876770019531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 19:22:15 compute-0 nova_compute[117514]: 2025-10-08 19:22:15.995 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:22:15 compute-0 nova_compute[117514]: 2025-10-08 19:22:15.996 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:22:16 compute-0 nova_compute[117514]: 2025-10-08 19:22:16.070 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  8 19:22:16 compute-0 nova_compute[117514]: 2025-10-08 19:22:16.071 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  8 19:22:16 compute-0 nova_compute[117514]: 2025-10-08 19:22:16.106 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  8 19:22:16 compute-0 nova_compute[117514]: 2025-10-08 19:22:16.122 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  8 19:22:16 compute-0 nova_compute[117514]: 2025-10-08 19:22:16.125 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  8 19:22:16 compute-0 nova_compute[117514]: 2025-10-08 19:22:16.125 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:22:16 compute-0 nova_compute[117514]: 2025-10-08 19:22:16.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:22:17 compute-0 nova_compute[117514]: 2025-10-08 19:22:17.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:22:18 compute-0 nova_compute[117514]: 2025-10-08 19:22:18.126 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:22:18 compute-0 nova_compute[117514]: 2025-10-08 19:22:18.127 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  8 19:22:19 compute-0 podman[156485]: 2025-10-08 19:22:19.670322078 +0000 UTC m=+0.081499076 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, name=ubi9-minimal, vcs-type=git, io.openshift.tags=minimal rhel9, config_id=edpm, version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  8 19:22:19 compute-0 podman[156487]: 2025-10-08 19:22:19.681551971 +0000 UTC m=+0.080415825 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  8 19:22:19 compute-0 podman[156486]: 2025-10-08 19:22:19.685094053 +0000 UTC m=+0.089113825 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  8 19:22:19 compute-0 nova_compute[117514]: 2025-10-08 19:22:19.718 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:22:19 compute-0 nova_compute[117514]: 2025-10-08 19:22:19.718 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  8 19:22:19 compute-0 nova_compute[117514]: 2025-10-08 19:22:19.718 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  8 19:22:19 compute-0 nova_compute[117514]: 2025-10-08 19:22:19.741 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  8 19:22:19 compute-0 nova_compute[117514]: 2025-10-08 19:22:19.741 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:22:19 compute-0 nova_compute[117514]: 2025-10-08 19:22:19.742 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:22:20 compute-0 nova_compute[117514]: 2025-10-08 19:22:20.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:22:21 compute-0 nova_compute[117514]: 2025-10-08 19:22:21.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:22:22 compute-0 nova_compute[117514]: 2025-10-08 19:22:22.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:22:22 compute-0 podman[156542]: 2025-10-08 19:22:22.672479609 +0000 UTC m=+0.083874224 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  8 19:22:22 compute-0 podman[156544]: 2025-10-08 19:22:22.672931062 +0000 UTC m=+0.079468068 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent)
Oct  8 19:22:22 compute-0 podman[156543]: 2025-10-08 19:22:22.714023824 +0000 UTC m=+0.125826841 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  8 19:22:23 compute-0 nova_compute[117514]: 2025-10-08 19:22:23.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:22:26 compute-0 nova_compute[117514]: 2025-10-08 19:22:26.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:22:27 compute-0 nova_compute[117514]: 2025-10-08 19:22:27.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:22:31 compute-0 nova_compute[117514]: 2025-10-08 19:22:31.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:22:32 compute-0 nova_compute[117514]: 2025-10-08 19:22:32.192 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:22:34 compute-0 podman[156608]: 2025-10-08 19:22:34.652120299 +0000 UTC m=+0.068415889 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  8 19:22:36 compute-0 nova_compute[117514]: 2025-10-08 19:22:36.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:22:37 compute-0 nova_compute[117514]: 2025-10-08 19:22:37.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:22:41 compute-0 nova_compute[117514]: 2025-10-08 19:22:41.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:22:42 compute-0 nova_compute[117514]: 2025-10-08 19:22:42.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:22:43 compute-0 podman[156633]: 2025-10-08 19:22:43.700023764 +0000 UTC m=+0.113170817 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  8 19:22:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:22:44.244 28643 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:22:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:22:44.244 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:22:44 compute-0 ovn_metadata_agent[28637]: 2025-10-08 19:22:44.244 28643 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:22:46 compute-0 nova_compute[117514]: 2025-10-08 19:22:46.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:22:47 compute-0 nova_compute[117514]: 2025-10-08 19:22:47.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:22:50 compute-0 podman[156657]: 2025-10-08 19:22:50.635969733 +0000 UTC m=+0.047371474 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 19:22:50 compute-0 podman[156655]: 2025-10-08 19:22:50.671069533 +0000 UTC m=+0.087567641 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=minimal rhel9, version=9.6, io.buildah.version=1.33.7, name=ubi9-minimal, release=1755695350, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct  8 19:22:50 compute-0 podman[156656]: 2025-10-08 19:22:50.682187963 +0000 UTC m=+0.086644754 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=multipathd)
Oct  8 19:22:51 compute-0 nova_compute[117514]: 2025-10-08 19:22:51.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:22:52 compute-0 nova_compute[117514]: 2025-10-08 19:22:52.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:22:53 compute-0 podman[156715]: 2025-10-08 19:22:53.648632596 +0000 UTC m=+0.071560720 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_id=iscsid, container_name=iscsid, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 19:22:53 compute-0 podman[156717]: 2025-10-08 19:22:53.6894248 +0000 UTC m=+0.087855669 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  8 19:22:53 compute-0 podman[156716]: 2025-10-08 19:22:53.69500029 +0000 UTC m=+0.104020564 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  8 19:22:56 compute-0 nova_compute[117514]: 2025-10-08 19:22:56.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:22:57 compute-0 nova_compute[117514]: 2025-10-08 19:22:57.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:23:01 compute-0 nova_compute[117514]: 2025-10-08 19:23:01.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:23:02 compute-0 nova_compute[117514]: 2025-10-08 19:23:02.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:23:05 compute-0 podman[156779]: 2025-10-08 19:23:05.644317677 +0000 UTC m=+0.065252588 container health_status 9241e280bf74d56c87c03bba80039861809b71e399333ed2f5ba1ad571a75f6d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  8 19:23:06 compute-0 nova_compute[117514]: 2025-10-08 19:23:06.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:23:07 compute-0 nova_compute[117514]: 2025-10-08 19:23:07.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:23:11 compute-0 nova_compute[117514]: 2025-10-08 19:23:11.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:23:12 compute-0 nova_compute[117514]: 2025-10-08 19:23:12.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:23:12 compute-0 systemd[1]: Created slice User Slice of UID 1000.
Oct  8 19:23:12 compute-0 systemd[1]: Starting User Runtime Directory /run/user/1000...
Oct  8 19:23:12 compute-0 systemd-logind[844]: New session 16 of user zuul.
Oct  8 19:23:12 compute-0 systemd[1]: Finished User Runtime Directory /run/user/1000.
Oct  8 19:23:12 compute-0 systemd[1]: Starting User Manager for UID 1000...
Oct  8 19:23:12 compute-0 systemd[156808]: Queued start job for default target Main User Target.
Oct  8 19:23:12 compute-0 systemd[156808]: Created slice User Application Slice.
Oct  8 19:23:12 compute-0 systemd[156808]: Started Mark boot as successful after the user session has run 2 minutes.
Oct  8 19:23:12 compute-0 systemd[156808]: Started Daily Cleanup of User's Temporary Directories.
Oct  8 19:23:12 compute-0 systemd[156808]: Reached target Paths.
Oct  8 19:23:12 compute-0 systemd[156808]: Reached target Timers.
Oct  8 19:23:12 compute-0 systemd[156808]: Starting D-Bus User Message Bus Socket...
Oct  8 19:23:12 compute-0 systemd[156808]: Starting Create User's Volatile Files and Directories...
Oct  8 19:23:12 compute-0 systemd[156808]: Listening on D-Bus User Message Bus Socket.
Oct  8 19:23:12 compute-0 systemd[156808]: Reached target Sockets.
Oct  8 19:23:12 compute-0 systemd[156808]: Finished Create User's Volatile Files and Directories.
Oct  8 19:23:12 compute-0 systemd[156808]: Reached target Basic System.
Oct  8 19:23:12 compute-0 systemd[156808]: Reached target Main User Target.
Oct  8 19:23:12 compute-0 systemd[156808]: Startup finished in 173ms.
Oct  8 19:23:12 compute-0 systemd[1]: Started User Manager for UID 1000.
Oct  8 19:23:12 compute-0 systemd[1]: Started Session 16 of User zuul.
Oct  8 19:23:14 compute-0 podman[156859]: 2025-10-08 19:23:14.098099148 +0000 UTC m=+0.111297123 container health_status e7a0917b9031f20c9f098d110a2e107bed2bdde8b7d131b07ad2e14fad52df7f (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:5d4fdf424fad33a3650163e9e7423f92e97de3305508c2b7c6435822e0313189', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible)
Oct  8 19:23:15 compute-0 nova_compute[117514]: 2025-10-08 19:23:15.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:23:16 compute-0 nova_compute[117514]: 2025-10-08 19:23:16.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  8 19:23:16 compute-0 nova_compute[117514]: 2025-10-08 19:23:16.713 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:23:16 compute-0 nova_compute[117514]: 2025-10-08 19:23:16.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  8 19:23:16 compute-0 nova_compute[117514]: 2025-10-08 19:23:16.750 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:23:16 compute-0 nova_compute[117514]: 2025-10-08 19:23:16.750 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  8 19:23:16 compute-0 nova_compute[117514]: 2025-10-08 19:23:16.751 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  8 19:23:16 compute-0 nova_compute[117514]: 2025-10-08 19:23:16.751 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  8 19:23:16 compute-0 nova_compute[117514]: 2025-10-08 19:23:16.937 2 WARNING nova.virt.libvirt.driver [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  8 19:23:16 compute-0 nova_compute[117514]: 2025-10-08 19:23:16.939 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5960MB free_disk=73.40849304199219GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  8 19:23:16 compute-0 nova_compute[117514]: 2025-10-08 19:23:16.939 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  8 19:23:16 compute-0 nova_compute[117514]: 2025-10-08 19:23:16.939 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  8 19:23:16 compute-0 nova_compute[117514]: 2025-10-08 19:23:16.997 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct  8 19:23:16 compute-0 nova_compute[117514]: 2025-10-08 19:23:16.997 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct  8 19:23:17 compute-0 nova_compute[117514]: 2025-10-08 19:23:17.019 2 DEBUG nova.compute.provider_tree [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed in ProviderTree for provider: 8dadd82c-8ff0-43f1-888f-64abe8b5e349 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  8 19:23:17 compute-0 nova_compute[117514]: 2025-10-08 19:23:17.031 2 DEBUG nova.scheduler.client.report [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Inventory has not changed for provider 8dadd82c-8ff0-43f1-888f-64abe8b5e349 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  8 19:23:17 compute-0 nova_compute[117514]: 2025-10-08 19:23:17.032 2 DEBUG nova.compute.resource_tracker [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct  8 19:23:17 compute-0 nova_compute[117514]: 2025-10-08 19:23:17.033 2 DEBUG oslo_concurrency.lockutils [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  8 19:23:17 compute-0 nova_compute[117514]: 2025-10-08 19:23:17.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:23:17 compute-0 ovs-vsctl[157017]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Oct  8 19:23:18 compute-0 virtqemud[117415]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Oct  8 19:23:18 compute-0 virtqemud[117415]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Oct  8 19:23:18 compute-0 virtqemud[117415]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct  8 19:23:19 compute-0 nova_compute[117514]: 2025-10-08 19:23:19.033 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 19:23:19 compute-0 nova_compute[117514]: 2025-10-08 19:23:19.035 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct  8 19:23:19 compute-0 nova_compute[117514]: 2025-10-08 19:23:19.719 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 19:23:19 compute-0 nova_compute[117514]: 2025-10-08 19:23:19.719 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct  8 19:23:19 compute-0 nova_compute[117514]: 2025-10-08 19:23:19.719 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct  8 19:23:19 compute-0 nova_compute[117514]: 2025-10-08 19:23:19.879 2 DEBUG nova.compute.manager [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct  8 19:23:19 compute-0 nova_compute[117514]: 2025-10-08 19:23:19.880 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 19:23:20 compute-0 nova_compute[117514]: 2025-10-08 19:23:20.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 19:23:21 compute-0 nova_compute[117514]: 2025-10-08 19:23:21.451 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:23:21 compute-0 podman[157523]: 2025-10-08 19:23:21.661374526 +0000 UTC m=+0.071389645 container health_status 58848f5576129addd6d9ade070b778586f0de7606faec29e854638436baf10c2 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, config_id=edpm)
Oct  8 19:23:21 compute-0 podman[157525]: 2025-10-08 19:23:21.685894281 +0000 UTC m=+0.100363868 container health_status 62ae6cd434398647d3a3fcdd1dd56e40a850c574059b17b38949c5b58f3e220d (image=quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd@sha256:02d33f59749441cd5751c319e9d7cff97ab1004844c0e992650d340c6e8fbf43', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  8 19:23:21 compute-0 podman[157526]: 2025-10-08 19:23:21.690850994 +0000 UTC m=+0.102382867 container health_status 9ee6ab5a54a1c15f0dae58e1273e834dc05ec757b181f0e771054d307cc87213 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  8 19:23:21 compute-0 nova_compute[117514]: 2025-10-08 19:23:21.716 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 19:23:22 compute-0 nova_compute[117514]: 2025-10-08 19:23:22.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  8 19:23:22 compute-0 systemd[1]: Starting Hostname Service...
Oct  8 19:23:22 compute-0 systemd[1]: Started Hostname Service.
Oct  8 19:23:22 compute-0 nova_compute[117514]: 2025-10-08 19:23:22.712 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  8 19:23:23 compute-0 podman[157664]: 2025-10-08 19:23:23.807235779 +0000 UTC m=+0.107008760 container health_status 3086ceafce06c06792d80996f1b97d3e86dee59b17705cffd5d091f0ced30845 (image=quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, container_name=iscsid, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid@sha256:261e76f60c6bc6b172dc3608504552c63e83358a4fa3c0952a671544d83aa83f', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  8 19:23:23 compute-0 podman[157689]: 2025-10-08 19:23:23.923073072 +0000 UTC m=+0.065663500 container health_status 80e02794d1f670087f039440c3295da4f8901866227981034211b6f18f18cb2a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:c3e651f35b930bcf1a3084be8910c2f3f34d22a976c5379cf518a68d9994bfa7', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  8 19:23:23 compute-0 podman[157687]: 2025-10-08 19:23:23.986744985 +0000 UTC m=+0.136145669 container health_status 4329c364dfc7511f11bd7adf36903ba4240c842579fd6f2298ac43622cc0fc59 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:d76f7d6620930cc2e9ac070492bbeb525f83ce5ff4947463e3784bf1ce04a857', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  8 19:23:24 compute-0 nova_compute[117514]: 2025-10-08 19:23:24.717 2 DEBUG oslo_service.periodic_task [None req-85140711-bfc1-484a-91e9-49a5019cf087 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
